Jan 11 17:30:24 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 11 17:30:24 crc restorecon[4694]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:24 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 11 17:30:25 crc restorecon[4694]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc 
restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc 
restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 
17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc 
restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc 
restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25
crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 
17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc 
restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc 
restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 11 17:30:25 crc restorecon[4694]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 11 17:30:25 crc restorecon[4694]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc 
restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 11 17:30:25 crc restorecon[4694]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 11 17:30:25 crc kubenswrapper[4837]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 11 17:30:25 crc kubenswrapper[4837]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 11 17:30:25 crc kubenswrapper[4837]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 11 17:30:25 crc kubenswrapper[4837]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 11 17:30:25 crc kubenswrapper[4837]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 11 17:30:25 crc kubenswrapper[4837]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.956854 4837 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962828 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962862 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962875 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962887 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962898 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962907 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962918 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962927 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962937 4837 feature_gate.go:330] unrecognized feature gate: Example Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962945 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962955 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962963 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962972 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962981 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962989 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.962997 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963005 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963014 4837 feature_gate.go:330] unrecognized 
feature gate: BuildCSIVolumes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963022 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963031 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963039 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963048 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963057 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963065 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963073 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963081 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963089 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963097 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963106 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963116 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963124 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963132 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 
17:30:25.963141 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963150 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963159 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963168 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963177 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963186 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963194 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963203 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963214 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963224 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963232 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963240 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963248 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963257 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963269 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963280 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963290 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963301 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963311 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963320 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963331 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963342 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963352 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963361 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963373 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963382 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963393 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963402 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963410 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963418 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963427 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963435 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963444 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963452 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963460 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 11 17:30:25 crc 
kubenswrapper[4837]: W0111 17:30:25.963469 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963477 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963485 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.963493 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.963969 4837 flags.go:64] FLAG: --address="0.0.0.0" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.963995 4837 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964013 4837 flags.go:64] FLAG: --anonymous-auth="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964026 4837 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964039 4837 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964049 4837 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964063 4837 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964086 4837 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964097 4837 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964107 4837 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964118 4837 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964128 4837 
flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964138 4837 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964147 4837 flags.go:64] FLAG: --cgroup-root="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964157 4837 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964167 4837 flags.go:64] FLAG: --client-ca-file="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964176 4837 flags.go:64] FLAG: --cloud-config="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964186 4837 flags.go:64] FLAG: --cloud-provider="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964195 4837 flags.go:64] FLAG: --cluster-dns="[]" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964208 4837 flags.go:64] FLAG: --cluster-domain="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964217 4837 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964227 4837 flags.go:64] FLAG: --config-dir="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964237 4837 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964247 4837 flags.go:64] FLAG: --container-log-max-files="5" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964260 4837 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964270 4837 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964280 4837 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964291 4837 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964301 4837 flags.go:64] 
FLAG: --contention-profiling="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964311 4837 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964321 4837 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964332 4837 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964341 4837 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964353 4837 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964362 4837 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964373 4837 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964382 4837 flags.go:64] FLAG: --enable-load-reader="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964392 4837 flags.go:64] FLAG: --enable-server="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964402 4837 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964414 4837 flags.go:64] FLAG: --event-burst="100" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964425 4837 flags.go:64] FLAG: --event-qps="50" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964434 4837 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964444 4837 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964455 4837 flags.go:64] FLAG: --eviction-hard="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964466 4837 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964476 4837 flags.go:64] 
FLAG: --eviction-minimum-reclaim="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964486 4837 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964496 4837 flags.go:64] FLAG: --eviction-soft="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964506 4837 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964515 4837 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964525 4837 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964535 4837 flags.go:64] FLAG: --experimental-mounter-path="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964545 4837 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964554 4837 flags.go:64] FLAG: --fail-swap-on="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964564 4837 flags.go:64] FLAG: --feature-gates="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964575 4837 flags.go:64] FLAG: --file-check-frequency="20s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964586 4837 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964596 4837 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964606 4837 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964616 4837 flags.go:64] FLAG: --healthz-port="10248" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964626 4837 flags.go:64] FLAG: --help="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964636 4837 flags.go:64] FLAG: --hostname-override="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964646 4837 flags.go:64] FLAG: 
--housekeeping-interval="10s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964656 4837 flags.go:64] FLAG: --http-check-frequency="20s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964665 4837 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964706 4837 flags.go:64] FLAG: --image-credential-provider-config="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964717 4837 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964727 4837 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964737 4837 flags.go:64] FLAG: --image-service-endpoint="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964746 4837 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964755 4837 flags.go:64] FLAG: --kube-api-burst="100" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964765 4837 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964776 4837 flags.go:64] FLAG: --kube-api-qps="50" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964786 4837 flags.go:64] FLAG: --kube-reserved="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964795 4837 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964804 4837 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964814 4837 flags.go:64] FLAG: --kubelet-cgroups="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964824 4837 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964833 4837 flags.go:64] FLAG: --lock-file="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964853 4837 
flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964863 4837 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964873 4837 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964897 4837 flags.go:64] FLAG: --log-json-split-stream="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964907 4837 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964917 4837 flags.go:64] FLAG: --log-text-split-stream="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964926 4837 flags.go:64] FLAG: --logging-format="text" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964936 4837 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964946 4837 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964956 4837 flags.go:64] FLAG: --manifest-url="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964966 4837 flags.go:64] FLAG: --manifest-url-header="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964978 4837 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964987 4837 flags.go:64] FLAG: --max-open-files="1000000" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.964999 4837 flags.go:64] FLAG: --max-pods="110" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965009 4837 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965020 4837 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965030 4837 flags.go:64] FLAG: --memory-manager-policy="None" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 
17:30:25.965039 4837 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965049 4837 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965059 4837 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965069 4837 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965091 4837 flags.go:64] FLAG: --node-status-max-images="50" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965101 4837 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965110 4837 flags.go:64] FLAG: --oom-score-adj="-999" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965121 4837 flags.go:64] FLAG: --pod-cidr="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965130 4837 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965145 4837 flags.go:64] FLAG: --pod-manifest-path="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965154 4837 flags.go:64] FLAG: --pod-max-pids="-1" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965164 4837 flags.go:64] FLAG: --pods-per-core="0" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965174 4837 flags.go:64] FLAG: --port="10250" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965184 4837 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965193 4837 flags.go:64] FLAG: --provider-id="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965203 4837 flags.go:64] FLAG: --qos-reserved="" Jan 11 17:30:25 crc 
kubenswrapper[4837]: I0111 17:30:25.965212 4837 flags.go:64] FLAG: --read-only-port="10255" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965224 4837 flags.go:64] FLAG: --register-node="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965234 4837 flags.go:64] FLAG: --register-schedulable="true" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965244 4837 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965260 4837 flags.go:64] FLAG: --registry-burst="10" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965270 4837 flags.go:64] FLAG: --registry-qps="5" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965280 4837 flags.go:64] FLAG: --reserved-cpus="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965290 4837 flags.go:64] FLAG: --reserved-memory="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965302 4837 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965312 4837 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965322 4837 flags.go:64] FLAG: --rotate-certificates="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965332 4837 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965342 4837 flags.go:64] FLAG: --runonce="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965351 4837 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965361 4837 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965371 4837 flags.go:64] FLAG: --seccomp-default="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965381 4837 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 11 17:30:25 crc 
kubenswrapper[4837]: I0111 17:30:25.965391 4837 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965401 4837 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965411 4837 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965421 4837 flags.go:64] FLAG: --storage-driver-password="root" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965432 4837 flags.go:64] FLAG: --storage-driver-secure="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965441 4837 flags.go:64] FLAG: --storage-driver-table="stats" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965451 4837 flags.go:64] FLAG: --storage-driver-user="root" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965461 4837 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965471 4837 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965480 4837 flags.go:64] FLAG: --system-cgroups="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965490 4837 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965505 4837 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965514 4837 flags.go:64] FLAG: --tls-cert-file="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965524 4837 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965535 4837 flags.go:64] FLAG: --tls-min-version="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965545 4837 flags.go:64] FLAG: --tls-private-key-file="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965554 4837 flags.go:64] FLAG: 
--topology-manager-policy="none" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965564 4837 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965573 4837 flags.go:64] FLAG: --topology-manager-scope="container" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965583 4837 flags.go:64] FLAG: --v="2" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965598 4837 flags.go:64] FLAG: --version="false" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965613 4837 flags.go:64] FLAG: --vmodule="" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965628 4837 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.965642 4837 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965916 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965930 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965940 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965950 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965959 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965968 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965976 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.965985 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 
17:30:25.965993 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966002 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966010 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966019 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966028 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966036 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966044 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966053 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966064 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966075 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966086 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966094 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966106 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966117 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966126 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966135 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966144 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966153 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966161 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966170 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966178 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966187 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966195 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966203 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966212 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966220 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966230 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966239 4837 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966248 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966256 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966265 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966274 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966282 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966291 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966299 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966308 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966317 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966325 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966334 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966342 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966351 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966359 4837 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966367 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966376 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966384 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966392 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966401 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966412 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966423 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966434 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966444 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966453 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966462 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966471 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966480 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966488 4837 feature_gate.go:330] 
unrecognized feature gate: Example Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966497 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966505 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966513 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966522 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966533 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966543 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.966554 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.966580 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.975990 4837 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.976030 4837 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976207 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 11 17:30:25 crc 
kubenswrapper[4837]: W0111 17:30:25.976226 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976239 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976250 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976261 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976272 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976283 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976294 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976303 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976312 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976319 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976328 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976336 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976344 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976352 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976362 4837 feature_gate.go:351] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976372 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976381 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976390 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976398 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976406 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976413 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976421 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976429 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976437 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976444 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976452 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976460 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976469 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976477 4837 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976485 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976493 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976500 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976509 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976519 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976526 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976534 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976541 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976550 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976558 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976565 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976577 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976589 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976598 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976608 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976616 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976624 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976632 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976639 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976647 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976655 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976663 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976700 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976708 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976716 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976724 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 
17:30:25.976732 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976740 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976747 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976756 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976767 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976778 4837 feature_gate.go:330] unrecognized feature gate: Example Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976786 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976797 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976807 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976816 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976824 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976833 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976841 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976848 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.976858 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.976871 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977101 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977116 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977125 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977135 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977143 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977151 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977159 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977166 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977175 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977184 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977192 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977200 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977207 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977216 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977223 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977233 4837 
feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977243 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977253 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977263 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977271 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977279 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977288 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977296 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977304 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977312 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977320 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977328 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977336 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977346 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977356 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977365 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977373 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977381 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977389 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977399 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977407 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977415 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977423 4837 feature_gate.go:330] unrecognized feature gate: Example Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977431 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977439 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977447 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977454 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977462 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977470 
4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977478 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977486 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977494 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977501 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977509 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977517 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977525 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977533 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977540 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977548 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977556 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977564 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977571 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977579 4837 feature_gate.go:330] unrecognized feature 
gate: MultiArchInstallGCP Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977586 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977594 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977602 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977610 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977617 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977625 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977632 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977640 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977648 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977655 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977663 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977696 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 11 17:30:25 crc kubenswrapper[4837]: W0111 17:30:25.977706 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.977717 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.977963 4837 server.go:940] "Client rotation is on, will bootstrap in background"
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.981990 4837 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.982113 4837 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.982888 4837 server.go:997] "Starting client certificate rotation"
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.982924 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.983428 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-13 16:28:15.749319526 +0000 UTC
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.983560 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.990781 4837 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 11 17:30:25 crc kubenswrapper[4837]: E0111 17:30:25.992651 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError"
Jan 11 17:30:25 crc kubenswrapper[4837]: I0111 17:30:25.993519 4837 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.005895 4837 log.go:25] "Validated CRI v1 runtime API"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.025947 4837 log.go:25] "Validated CRI v1 image API"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.027905 4837 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.030139 4837
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-11-17-22-59-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.030176 4837 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.047112 4837 manager.go:217] Machine: {Timestamp:2026-01-11 17:30:26.045806696 +0000 UTC m=+0.223999442 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:860923a9-1393-406f-8d70-e9e11a3b51a7 BootID:998e7fc1-b4f8-46e4-96eb-3a9f319ce8e1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 
Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e1:9f:b5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e1:9f:b5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9c:61:31 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:eb:8c:3b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4a:c7:a6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:11:ef:ba Speed:-1 Mtu:1496} {Name:eth10 MacAddress:d6:3f:ac:0e:9c:50 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:e4:c2:cc:74:3c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.047609 4837 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available.
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.047868 4837 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.048146 4837 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.048328 4837 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.048368 4837 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.048603 4837 topology_manager.go:138] "Creating topology manager with none policy"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.048617 4837 container_manager_linux.go:303] "Creating device plugin manager"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.048822 4837 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.048850 4837 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.049124 4837 state_mem.go:36] "Initialized new in-memory state store"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.049467 4837 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.050251 4837 kubelet.go:418] "Attempting to sync node with API server"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.050274 4837 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.050298 4837 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.050313 4837 kubelet.go:324] "Adding apiserver pod source"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.050326 4837 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.052082 4837 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.052420 4837 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.053383 4837 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 11 17:30:26 crc kubenswrapper[4837]: W0111 17:30:26.053368 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.053577 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:26 crc kubenswrapper[4837]: W0111 17:30:26.053428 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.053635 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection 
refused" logger="UnhandledError" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054052 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054083 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054093 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054102 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054117 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054126 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054154 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054168 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054178 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054186 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054216 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054224 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054430 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.054864 4837 server.go:1280] "Started kubelet" Jan 11 
17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.055287 4837 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.055206 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.055537 4837 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.056005 4837 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 11 17:30:26 crc systemd[1]: Started Kubernetes Kubelet. Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.060274 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1889bd7200ac4627 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-11 17:30:26.054841895 +0000 UTC m=+0.233034611,LastTimestamp:2026-01-11 17:30:26.054841895 +0000 UTC m=+0.233034611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.061197 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.061244 4837 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 11 
17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.061286 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 11:24:03.790582616 +0000 UTC Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.061345 4837 server.go:460] "Adding debug handlers to kubelet server" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.061385 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.061970 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.062641 4837 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.062616 4837 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.062706 4837 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 11 17:30:26 crc kubenswrapper[4837]: W0111 17:30:26.063374 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.066765 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 
17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.067425 4837 factory.go:153] Registering CRI-O factory Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.067450 4837 factory.go:221] Registration of the crio container factory successfully Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.067515 4837 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.067525 4837 factory.go:55] Registering systemd factory Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.067533 4837 factory.go:221] Registration of the systemd container factory successfully Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.067553 4837 factory.go:103] Registering Raw factory Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.067572 4837 manager.go:1196] Started watching for new ooms in manager Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.068382 4837 manager.go:319] Starting recovery of all containers Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.077986 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078058 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078074 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078086 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078099 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078111 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078122 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078134 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078147 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078159 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078170 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078183 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078195 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078208 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078219 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078231 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078247 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078260 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078290 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078335 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078349 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078362 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078373 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.078388 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079492 4837 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079540 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079556 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079574 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079586 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079597 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079608 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079618 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079628 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079638 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079648 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079657 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079681 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079691 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079701 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079712 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079721 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079731 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079740 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079750 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079759 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079768 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079828 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079843 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079854 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079864 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079876 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" 
seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079886 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079895 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079909 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079920 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079930 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079940 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 
17:30:26.079950 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079960 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079969 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079981 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.079990 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080019 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080034 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080046 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080055 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080065 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080074 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080084 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080093 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080102 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080112 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080121 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080131 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080140 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080149 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080159 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080167 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080203 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080812 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080841 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080851 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080861 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080872 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080883 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080893 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080905 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080915 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080925 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080935 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080946 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080957 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080967 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080979 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" 
seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080989 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.080999 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081010 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081021 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081032 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081044 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081055 
4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081064 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081074 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081084 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081095 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081110 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081122 4837 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081132 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081142 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081153 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081163 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081173 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081184 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081194 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081928 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081949 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081959 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.081968 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082415 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082435 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082449 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082463 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082479 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082493 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082506 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082520 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082533 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082546 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082559 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082572 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082587 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082600 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082613 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082626 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082641 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082653 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082666 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" 
seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082719 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082735 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082750 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082764 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082780 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082793 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082806 
4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082820 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082837 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082852 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082865 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082879 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082893 4837 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082915 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082930 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082943 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082956 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082969 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082982 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.082997 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083011 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083025 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083038 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083053 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083065 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083080 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083092 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083106 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083119 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083134 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083147 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083160 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083175 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083189 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083201 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083245 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083258 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083271 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083285 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083298 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083314 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083328 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083343 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083357 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083373 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083388 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083401 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083414 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083428 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083441 
4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083453 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083467 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083480 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083493 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083505 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083518 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083532 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083547 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083561 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083574 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083587 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083601 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083614 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083627 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083640 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083655 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083670 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083701 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083713 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083726 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083740 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083754 4837 reconstruct.go:97] "Volume reconstruction finished" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.083765 4837 reconciler.go:26] "Reconciler: start to sync state" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.096420 4837 manager.go:324] Recovery completed Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.106109 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.107785 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.107847 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.107857 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.108590 4837 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.108628 4837 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.108657 4837 state_mem.go:36] "Initialized new in-memory state store" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.161861 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.262294 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.262791 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.360786 4837 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.362700 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.362664 4837 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.362759 4837 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.362791 4837 kubelet.go:2335] "Starting kubelet main sync loop" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.362842 4837 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 11 17:30:26 crc kubenswrapper[4837]: W0111 17:30:26.363400 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.363448 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.462824 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.462881 4837 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.482436 4837 policy_none.go:49] "None policy: Start" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.484625 4837 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.484652 4837 state_mem.go:35] "Initializing new in-memory state store" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 
17:30:26.562919 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.610903 4837 manager.go:334] "Starting Device Plugin manager" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.611028 4837 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.611040 4837 server.go:79] "Starting device plugin registration server" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.611456 4837 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.611473 4837 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.611666 4837 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.611763 4837 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.611778 4837 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.618871 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.663382 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.663505 4837 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.663753 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.664734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.664767 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.664777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.664889 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.665033 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.665069 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.666635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.666658 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.666668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.666926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.666945 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.666954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.667024 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.667259 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.667307 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.667754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.667782 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.667790 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.667916 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668315 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668525 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668636 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668884 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.668919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.669125 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.669362 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.669445 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.669708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.669729 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.669738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.670159 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.670595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.670741 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.670641 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.670920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.670941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.671204 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.671327 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.674463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.674549 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.674589 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.712446 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.714909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.714950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.714963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.714992 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.715598 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791422 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791481 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791510 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791535 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791556 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791596 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791618 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791876 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791919 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") 
pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791945 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791964 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.791983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.792153 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894482 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc 
kubenswrapper[4837]: I0111 17:30:26.894593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894657 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894796 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894873 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894927 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.894913 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895009 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895039 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895031 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895056 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895094 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895240 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895255 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895098 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895103 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895318 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895347 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc 
kubenswrapper[4837]: I0111 17:30:26.895377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895409 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895478 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895544 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.895733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.916500 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.917903 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.917942 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.917954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.917981 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:26 crc kubenswrapper[4837]: E0111 17:30:26.918467 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Jan 11 17:30:26 crc kubenswrapper[4837]: I0111 17:30:26.998770 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.006005 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.021975 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.031912 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.036258 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:27 crc kubenswrapper[4837]: W0111 17:30:27.042374 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d03a57a2d28fdf696fa8ee988c62f6d555973ebf23d3c57ed4a100df06ab182e WatchSource:0}: Error finding container d03a57a2d28fdf696fa8ee988c62f6d555973ebf23d3c57ed4a100df06ab182e: Status 404 returned error can't find the container with id d03a57a2d28fdf696fa8ee988c62f6d555973ebf23d3c57ed4a100df06ab182e Jan 11 17:30:27 crc kubenswrapper[4837]: W0111 17:30:27.052145 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-37436e5705de7f4c24a899ed80cc57942f4bbd1c4a7b391992f9465dc252663a WatchSource:0}: Error finding container 37436e5705de7f4c24a899ed80cc57942f4bbd1c4a7b391992f9465dc252663a: Status 404 returned error can't find the container with id 37436e5705de7f4c24a899ed80cc57942f4bbd1c4a7b391992f9465dc252663a Jan 11 17:30:27 crc kubenswrapper[4837]: W0111 17:30:27.053212 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ac29a3f34a12c8dabc0ec63be74c3690fa8780900f01fd18e5a939968d4b5423 WatchSource:0}: Error finding container ac29a3f34a12c8dabc0ec63be74c3690fa8780900f01fd18e5a939968d4b5423: Status 404 returned error can't find the container with id ac29a3f34a12c8dabc0ec63be74c3690fa8780900f01fd18e5a939968d4b5423 Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.056019 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.062303 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 04:40:49.580635682 +0000 UTC Jan 11 17:30:27 crc kubenswrapper[4837]: W0111 17:30:27.062735 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:27 crc kubenswrapper[4837]: E0111 17:30:27.062874 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:27 crc kubenswrapper[4837]: W0111 17:30:27.139759 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-095e96f267b4cc13275670be2461f3d1dcec4bedd882ef813b4e91389cfb71c7 WatchSource:0}: Error 
finding container 095e96f267b4cc13275670be2461f3d1dcec4bedd882ef813b4e91389cfb71c7: Status 404 returned error can't find the container with id 095e96f267b4cc13275670be2461f3d1dcec4bedd882ef813b4e91389cfb71c7 Jan 11 17:30:27 crc kubenswrapper[4837]: W0111 17:30:27.168945 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:27 crc kubenswrapper[4837]: E0111 17:30:27.169064 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:27 crc kubenswrapper[4837]: W0111 17:30:27.198314 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:27 crc kubenswrapper[4837]: E0111 17:30:27.198403 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.319262 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.320530 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.320574 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.320583 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.320606 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:27 crc kubenswrapper[4837]: E0111 17:30:27.321061 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.366328 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6fd62ed65e293233219c86c06917bd4f45bf54fde9e4aeb020217101b2b6b369"} Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.367331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"37436e5705de7f4c24a899ed80cc57942f4bbd1c4a7b391992f9465dc252663a"} Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.368467 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d03a57a2d28fdf696fa8ee988c62f6d555973ebf23d3c57ed4a100df06ab182e"} Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.369461 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac29a3f34a12c8dabc0ec63be74c3690fa8780900f01fd18e5a939968d4b5423"} Jan 11 17:30:27 crc kubenswrapper[4837]: I0111 17:30:27.370177 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"095e96f267b4cc13275670be2461f3d1dcec4bedd882ef813b4e91389cfb71c7"} Jan 11 17:30:27 crc kubenswrapper[4837]: E0111 17:30:27.464666 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s" Jan 11 17:30:27 crc kubenswrapper[4837]: E0111 17:30:27.659694 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1889bd7200ac4627 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-11 17:30:26.054841895 +0000 UTC m=+0.233034611,LastTimestamp:2026-01-11 17:30:26.054841895 +0000 UTC m=+0.233034611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 11 17:30:28 crc kubenswrapper[4837]: I0111 17:30:28.056959 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:28 crc 
kubenswrapper[4837]: I0111 17:30:28.063280 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 00:43:22.95522058 +0000 UTC Jan 11 17:30:28 crc kubenswrapper[4837]: I0111 17:30:28.121726 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:28 crc kubenswrapper[4837]: I0111 17:30:28.123195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:28 crc kubenswrapper[4837]: I0111 17:30:28.123235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:28 crc kubenswrapper[4837]: I0111 17:30:28.123247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:28 crc kubenswrapper[4837]: I0111 17:30:28.123274 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:28 crc kubenswrapper[4837]: E0111 17:30:28.123713 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Jan 11 17:30:28 crc kubenswrapper[4837]: I0111 17:30:28.187335 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 11 17:30:28 crc kubenswrapper[4837]: E0111 17:30:28.188638 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:29 crc kubenswrapper[4837]: W0111 17:30:29.034874 4837 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:29 crc kubenswrapper[4837]: E0111 17:30:29.034978 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.056315 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.063597 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 01:57:37.89876355 +0000 UTC Jan 11 17:30:29 crc kubenswrapper[4837]: E0111 17:30:29.065277 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="3.2s" Jan 11 17:30:29 crc kubenswrapper[4837]: W0111 17:30:29.285451 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:29 crc kubenswrapper[4837]: E0111 17:30:29.285566 4837 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:29 crc kubenswrapper[4837]: W0111 17:30:29.331621 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:29 crc kubenswrapper[4837]: E0111 17:30:29.331755 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.578057 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6e43d48dddc3ebbe0e86cf10ae809e141d3d760533889fec5f778ea0348767ad" exitCode=0 Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.578134 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6e43d48dddc3ebbe0e86cf10ae809e141d3d760533889fec5f778ea0348767ad"} Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.578353 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.579824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.579872 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.579888 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.582055 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b" exitCode=0 Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.582189 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.582184 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b"} Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.583869 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.583920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.583937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.584599 4837 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="080f099735ee1daf20d02d67b317aa73e4be0ace2baafc25572e10fc70554e81" exitCode=0 Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.584704 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"080f099735ee1daf20d02d67b317aa73e4be0ace2baafc25572e10fc70554e81"} Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.584747 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.586043 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587482 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587399 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587742 4837 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ebe3e5dfb1cb302cd3b72ee88a864dc0e47a098eb7775d2fe93f100207bb910d" exitCode=0 Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587814 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ebe3e5dfb1cb302cd3b72ee88a864dc0e47a098eb7775d2fe93f100207bb910d"} Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.587940 4837 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.588290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.589440 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.589492 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.589516 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.589946 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe"} Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.724276 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.725148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.725183 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.725196 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:29 crc kubenswrapper[4837]: I0111 17:30:29.725217 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:29 crc kubenswrapper[4837]: E0111 17:30:29.725825 
4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Jan 11 17:30:30 crc kubenswrapper[4837]: I0111 17:30:30.057037 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:30 crc kubenswrapper[4837]: I0111 17:30:30.063761 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 00:50:40.974621456 +0000 UTC Jan 11 17:30:30 crc kubenswrapper[4837]: W0111 17:30:30.122000 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:30 crc kubenswrapper[4837]: E0111 17:30:30.122095 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:30 crc kubenswrapper[4837]: I0111 17:30:30.595149 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881"} Jan 11 17:30:31 crc kubenswrapper[4837]: I0111 17:30:31.056115 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:31 crc kubenswrapper[4837]: W0111 17:30:31.056795 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:31 crc kubenswrapper[4837]: E0111 17:30:31.056911 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:31 crc kubenswrapper[4837]: I0111 17:30:31.064500 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 21:41:34.246754452 +0000 UTC Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.056776 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.065473 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 00:26:56.12140232 +0000 UTC Jan 11 17:30:32 crc kubenswrapper[4837]: E0111 17:30:32.265850 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="6.4s" 
Jan 11 17:30:32 crc kubenswrapper[4837]: W0111 17:30:32.479989 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:32 crc kubenswrapper[4837]: E0111 17:30:32.480114 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.566960 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 11 17:30:32 crc kubenswrapper[4837]: E0111 17:30:32.568577 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.602368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"06e75931338b3bbd83cb3267699473b3347e39227c029a932fef7a8063c55ced"} Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.602499 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.603875 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.603917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.603928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.604380 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3f1570357568efb8bd28e5d0705440f75cbdee0d1872382250316b2ced45b7b4"} Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.606368 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dce0361713529f1f3d55557e73df22d35bebbe40066189b37ba03fc2cf514205" exitCode=0 Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.606452 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.606460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dce0361713529f1f3d55557e73df22d35bebbe40066189b37ba03fc2cf514205"} Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.607187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.607215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.607225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.609293 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c"} Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.926183 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.927420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.927492 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.927511 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:32 crc kubenswrapper[4837]: I0111 17:30:32.927548 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:32 crc kubenswrapper[4837]: E0111 17:30:32.928266 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.056326 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.066072 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 12:25:19.617417438 +0000 UTC Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.615851 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db275db8ed6536a370703777ae03fc20d9a4336ba7ef28cca23945d9f76dc137"} Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.619438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829"} Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.619497 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.620803 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.620865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:33 crc kubenswrapper[4837]: I0111 17:30:33.620887 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.066860 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 14:11:13.659380968 +0000 UTC Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.624119 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="db275db8ed6536a370703777ae03fc20d9a4336ba7ef28cca23945d9f76dc137" exitCode=0 Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.624169 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"db275db8ed6536a370703777ae03fc20d9a4336ba7ef28cca23945d9f76dc137"} Jan 11 17:30:34 
crc kubenswrapper[4837]: I0111 17:30:34.624333 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.625851 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.625904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.625921 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.626987 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d"} Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.630260 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1454dce2acfc65d7ef5ab03957c74ea3fba47fe5ac62a9e8a4599ac7b6a2c217"} Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.630296 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"61c9dacc05eec80f9db657621ea0185c5aa06416af8f96a0a0cb43304b6cd17c"} Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.630298 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.631237 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:34 crc 
kubenswrapper[4837]: I0111 17:30:34.631292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.631316 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.633146 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3"} Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.633244 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.634026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.634056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:34 crc kubenswrapper[4837]: I0111 17:30:34.634068 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.066937 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 01:41:23.43209423 +0000 UTC Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.640600 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"89e238a830afa451f79f3035a97d82d54719815e59f685e3b3deabf86ec71dba"} Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.645220 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d"} Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.645327 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.645420 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.645553 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.647042 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.647096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.647113 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.647115 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.647160 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:35 crc kubenswrapper[4837]: I0111 17:30:35.647184 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:36 crc kubenswrapper[4837]: I0111 17:30:36.079528 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 10:30:04.321343665 +0000 UTC 
Jan 11 17:30:36 crc kubenswrapper[4837]: E0111 17:30:36.619138 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 11 17:30:36 crc kubenswrapper[4837]: I0111 17:30:36.651446 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e120bccd67a15bdc494a1dfdf00295cfa89fe116c3b64ca8909e4da525c0f8ef"} Jan 11 17:30:36 crc kubenswrapper[4837]: I0111 17:30:36.656798 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:36 crc kubenswrapper[4837]: I0111 17:30:36.656896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54"} Jan 11 17:30:36 crc kubenswrapper[4837]: I0111 17:30:36.658227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:36 crc kubenswrapper[4837]: I0111 17:30:36.658281 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:36 crc kubenswrapper[4837]: I0111 17:30:36.658299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:37 crc kubenswrapper[4837]: I0111 17:30:37.080486 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:03:35.642347528 +0000 UTC Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.031167 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.031372 
4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.033171 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.033236 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.033261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.081261 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:03:57.527823444 +0000 UTC Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.667946 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b80c685860d30559baac31d20782dd8a5237a1672501b8c6e1d8a2d70f75e128"} Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.667980 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9724cece1f266392bd7016e44fe55c35fe0e2eb8ef58323118d5d5fced3674c7"} Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.671533 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f228eaa8376ca17f605b85e71a21a46e226d720350baabc72ef9ef6d468cd0e"} Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.671767 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.672591 
4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.672764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:38 crc kubenswrapper[4837]: I0111 17:30:38.672878 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.082236 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 05:14:26.431394436 +0000 UTC Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.329375 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.331009 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.331069 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.331094 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.331140 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.414991 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.680739 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"95408498599cfaa29dff9261ee0f65658c32657f2df5a2bb23b0ed442eba84d6"} Jan 
11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.680841 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.680864 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.680973 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.682187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.682228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.682244 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.682635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.682862 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:39 crc kubenswrapper[4837]: I0111 17:30:39.683042 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.083449 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 08:31:23.746299243 +0000 UTC Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.580803 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 
17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.581034 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.582886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.582949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.582970 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.603557 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.684552 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.684610 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.686557 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.686615 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.686643 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.686557 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.686801 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.686826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.687750 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.687988 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.689593 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.689870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:40 crc kubenswrapper[4837]: I0111 17:30:40.690046 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:41 crc kubenswrapper[4837]: I0111 17:30:41.084404 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 02:55:40.061425966 +0000 UTC Jan 11 17:30:41 crc kubenswrapper[4837]: I0111 17:30:41.713290 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:30:41 crc kubenswrapper[4837]: I0111 17:30:41.713501 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:41 crc kubenswrapper[4837]: I0111 17:30:41.714799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:41 crc kubenswrapper[4837]: I0111 17:30:41.714850 4837 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:41 crc kubenswrapper[4837]: I0111 17:30:41.714864 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.040906 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.041208 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.043192 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.043251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.043269 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.049185 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.085992 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 17:18:47.29533062 +0000 UTC Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.690218 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.692317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.692374 4837 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.692413 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:42 crc kubenswrapper[4837]: I0111 17:30:42.696045 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.087134 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 05:55:46.393945624 +0000 UTC Jan 11 17:30:43 crc kubenswrapper[4837]: W0111 17:30:43.547411 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.547625 4837 trace.go:236] Trace[529679933]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Jan-2026 17:30:33.545) (total time: 10002ms): Jan 11 17:30:43 crc kubenswrapper[4837]: Trace[529679933]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:30:43.547) Jan 11 17:30:43 crc kubenswrapper[4837]: Trace[529679933]: [10.002086561s] [10.002086561s] END Jan 11 17:30:43 crc kubenswrapper[4837]: E0111 17:30:43.547718 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.688609 4837 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.688832 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.693216 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.694307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.694377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:43 crc kubenswrapper[4837]: I0111 17:30:43.694397 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:44 crc kubenswrapper[4837]: I0111 17:30:44.056788 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 11 17:30:44 crc kubenswrapper[4837]: I0111 17:30:44.087622 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 18:56:44.251833626 
+0000 UTC Jan 11 17:30:44 crc kubenswrapper[4837]: I0111 17:30:44.563497 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 11 17:30:44 crc kubenswrapper[4837]: I0111 17:30:44.563844 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:44 crc kubenswrapper[4837]: I0111 17:30:44.565424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:44 crc kubenswrapper[4837]: I0111 17:30:44.565473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:44 crc kubenswrapper[4837]: I0111 17:30:44.565491 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:45 crc kubenswrapper[4837]: I0111 17:30:45.088732 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:32:27.302458121 +0000 UTC Jan 11 17:30:45 crc kubenswrapper[4837]: W0111 17:30:45.976326 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 11 17:30:45 crc kubenswrapper[4837]: I0111 17:30:45.976416 4837 trace.go:236] Trace[653501131]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Jan-2026 17:30:35.973) (total time: 10002ms): Jan 11 17:30:45 crc kubenswrapper[4837]: Trace[653501131]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (17:30:45.976) Jan 11 17:30:45 crc kubenswrapper[4837]: Trace[653501131]: [10.002781311s] [10.002781311s] END Jan 11 17:30:45 crc kubenswrapper[4837]: 
E0111 17:30:45.976442 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 11 17:30:46 crc kubenswrapper[4837]: I0111 17:30:46.089089 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 11:58:33.059685807 +0000 UTC Jan 11 17:30:46 crc kubenswrapper[4837]: W0111 17:30:46.482462 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 11 17:30:46 crc kubenswrapper[4837]: I0111 17:30:46.482611 4837 trace.go:236] Trace[1193993856]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Jan-2026 17:30:36.481) (total time: 10001ms): Jan 11 17:30:46 crc kubenswrapper[4837]: Trace[1193993856]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:30:46.482) Jan 11 17:30:46 crc kubenswrapper[4837]: Trace[1193993856]: [10.001285117s] [10.001285117s] END Jan 11 17:30:46 crc kubenswrapper[4837]: E0111 17:30:46.482644 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 11 17:30:46 crc kubenswrapper[4837]: E0111 17:30:46.619293 4837 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"crc\" not found" Jan 11 17:30:47 crc kubenswrapper[4837]: I0111 17:30:47.089619 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 04:19:08.507000771 +0000 UTC Jan 11 17:30:48 crc kubenswrapper[4837]: I0111 17:30:48.090355 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 00:47:32.624162084 +0000 UTC Jan 11 17:30:48 crc kubenswrapper[4837]: E0111 17:30:48.666728 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="7s" Jan 11 17:30:49 crc kubenswrapper[4837]: E0111 17:30:49.042814 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.1889bd7200ac4627 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-11 17:30:26.054841895 +0000 UTC m=+0.233034611,LastTimestamp:2026-01-11 17:30:26.054841895 +0000 UTC m=+0.233034611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 11 17:30:49 crc kubenswrapper[4837]: I0111 17:30:49.091308 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 
08:28:09.329998527 +0000 UTC Jan 11 17:30:49 crc kubenswrapper[4837]: W0111 17:30:49.230183 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 11 17:30:49 crc kubenswrapper[4837]: I0111 17:30:49.230318 4837 trace.go:236] Trace[757320557]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Jan-2026 17:30:39.228) (total time: 10001ms): Jan 11 17:30:49 crc kubenswrapper[4837]: Trace[757320557]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:30:49.230) Jan 11 17:30:49 crc kubenswrapper[4837]: Trace[757320557]: [10.001432954s] [10.001432954s] END Jan 11 17:30:49 crc kubenswrapper[4837]: E0111 17:30:49.230350 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 11 17:30:49 crc kubenswrapper[4837]: E0111 17:30:49.332410 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Jan 11 17:30:49 crc kubenswrapper[4837]: I0111 17:30:49.388078 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 11 17:30:49 crc kubenswrapper[4837]: I0111 17:30:49.388312 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:49 crc kubenswrapper[4837]: I0111 17:30:49.390011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 11 17:30:49 crc kubenswrapper[4837]: I0111 17:30:49.390070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:49 crc kubenswrapper[4837]: I0111 17:30:49.390096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:50 crc kubenswrapper[4837]: I0111 17:30:50.092772 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 00:53:06.549114935 +0000 UTC Jan 11 17:30:50 crc kubenswrapper[4837]: I0111 17:30:50.388239 4837 patch_prober.go:28] interesting pod/etcd-crc container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.126.11:9980/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:30:50 crc kubenswrapper[4837]: I0111 17:30:50.388317 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-crc" podUID="2139d3e2895fc6797b9c76a1b4c9886d" containerName="etcd" probeResult="failure" output="Get \"https://192.168.126.11:9980/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:30:50 crc kubenswrapper[4837]: E0111 17:30:50.606584 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 11 17:30:51 crc kubenswrapper[4837]: I0111 17:30:51.093246 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2026-01-04 23:39:13.925645781 +0000 UTC Jan 11 17:30:51 crc kubenswrapper[4837]: I0111 17:30:51.714323 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:30:51 crc kubenswrapper[4837]: I0111 17:30:51.714458 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:30:52 crc kubenswrapper[4837]: I0111 17:30:52.093732 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 09:17:11.349013463 +0000 UTC Jan 11 17:30:53 crc kubenswrapper[4837]: I0111 17:30:53.094642 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 07:34:21.502215924 +0000 UTC Jan 11 17:30:53 crc kubenswrapper[4837]: I0111 17:30:53.688796 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:30:53 crc kubenswrapper[4837]: I0111 17:30:53.689246 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:30:54 crc kubenswrapper[4837]: I0111 17:30:54.095229 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 12:34:02.885118117 +0000 UTC Jan 11 17:30:55 crc kubenswrapper[4837]: I0111 17:30:55.057580 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 11 17:30:55 crc kubenswrapper[4837]: I0111 17:30:55.095578 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 18:55:21.346317003 +0000 UTC Jan 11 17:30:56 crc kubenswrapper[4837]: I0111 17:30:56.096649 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 17:29:44.796212977 +0000 UTC Jan 11 17:30:56 crc kubenswrapper[4837]: I0111 17:30:56.333031 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:30:56 crc kubenswrapper[4837]: I0111 17:30:56.334578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:30:56 crc kubenswrapper[4837]: I0111 17:30:56.334762 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:30:56 crc kubenswrapper[4837]: I0111 17:30:56.334881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:30:56 crc kubenswrapper[4837]: I0111 17:30:56.334997 4837 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:30:56 crc kubenswrapper[4837]: E0111 17:30:56.619523 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 11 17:30:57 crc kubenswrapper[4837]: I0111 17:30:57.097558 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 11:06:59.034299875 +0000 UTC Jan 11 17:30:58 crc kubenswrapper[4837]: I0111 17:30:58.098093 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 02:18:08.839344919 +0000 UTC Jan 11 17:30:59 crc kubenswrapper[4837]: I0111 17:30:59.099781 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 04:49:05.389276412 +0000 UTC Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.101578 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:21:27.207500405 +0000 UTC Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.272318 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.272553 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.273978 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.274019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.274039 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.291414 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.381237 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.382602 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.382654 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.382706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.716302 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50392->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.716386 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50392->192.168.126.11:17697: read: connection reset by peer" Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.716863 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50398->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 11 17:31:00 crc kubenswrapper[4837]: I0111 17:31:00.716902 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50398->192.168.126.11:17697: read: connection reset by peer" Jan 11 17:31:01 crc kubenswrapper[4837]: I0111 17:31:01.101785 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:40:21.69499297 +0000 UTC Jan 11 17:31:01 crc kubenswrapper[4837]: I0111 17:31:01.715481 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" start-of-body= Jan 11 17:31:01 crc kubenswrapper[4837]: I0111 17:31:01.715576 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.102616 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 19:11:49.179791807 +0000 UTC Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.389372 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 
17:31:02.393017 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f228eaa8376ca17f605b85e71a21a46e226d720350baabc72ef9ef6d468cd0e" exitCode=255 Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.393118 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9f228eaa8376ca17f605b85e71a21a46e226d720350baabc72ef9ef6d468cd0e"} Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.393340 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.394928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.394990 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.395010 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.395863 4837 scope.go:117] "RemoveContainer" containerID="9f228eaa8376ca17f605b85e71a21a46e226d720350baabc72ef9ef6d468cd0e" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.989407 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:45194->192.168.126.11:10357: read: connection reset by peer" start-of-body= Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.989493 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:45194->192.168.126.11:10357: read: connection reset by peer" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.989572 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.989842 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.993830 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.993887 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.993910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.994820 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Jan 11 17:31:02 crc kubenswrapper[4837]: I0111 17:31:02.995123 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881" gracePeriod=30 Jan 11 17:31:03 crc 
kubenswrapper[4837]: I0111 17:31:03.103364 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 01:13:20.332066794 +0000 UTC Jan 11 17:31:03 crc kubenswrapper[4837]: I0111 17:31:03.395230 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 11 17:31:03 crc kubenswrapper[4837]: I0111 17:31:03.395793 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.104291 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 14:31:02.790532388 +0000 UTC Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.399742 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.404688 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027"} Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.404827 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:04 crc 
kubenswrapper[4837]: I0111 17:31:04.407242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.407274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.407285 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.410521 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.414042 4837 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881" exitCode=255 Jan 11 17:31:04 crc kubenswrapper[4837]: I0111 17:31:04.414096 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881"} Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.104440 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 16:55:11.215393856 +0000 UTC Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.104519 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 167h24m6.110884384s for next certificate rotation Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.284079 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 
17:31:05.420723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.421148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1"} Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.421183 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.421246 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.422598 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.422656 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.422706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.422658 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.422837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:05 crc kubenswrapper[4837]: I0111 17:31:05.422858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.424362 4837 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.425309 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.425351 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.425363 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:06 crc kubenswrapper[4837]: E0111 17:31:06.619727 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.693148 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.705241 4837 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.716426 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.716560 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.717560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.717602 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.717611 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.719907 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.724339 4837 csr.go:261] certificate signing request csr-lskc6 is approved, waiting to be issued Jan 11 17:31:06 crc kubenswrapper[4837]: I0111 17:31:06.730726 4837 csr.go:257] certificate signing request csr-lskc6 is issued Jan 11 17:31:07 crc kubenswrapper[4837]: I0111 17:31:07.427528 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:07 crc kubenswrapper[4837]: I0111 17:31:07.428285 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:07 crc kubenswrapper[4837]: I0111 17:31:07.428387 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:07 crc kubenswrapper[4837]: I0111 17:31:07.428466 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:07 crc kubenswrapper[4837]: I0111 17:31:07.732159 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-11 17:26:06 +0000 UTC, rotation deadline is 2026-11-05 14:01:28.705790093 +0000 UTC Jan 11 17:31:07 crc kubenswrapper[4837]: I0111 17:31:07.732204 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7148h30m20.973589116s for next certificate rotation Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.031878 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.032067 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.033473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.033514 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.033531 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.040699 4837 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 11 17:31:08 crc kubenswrapper[4837]: E0111 17:31:08.376396 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="7s" Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.379241 4837 trace.go:236] Trace[1495090338]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Jan-2026 17:30:54.695) (total time: 13683ms): Jan 11 17:31:08 crc kubenswrapper[4837]: Trace[1495090338]: ---"Objects listed" error: 13683ms (17:31:08.379) Jan 11 17:31:08 crc kubenswrapper[4837]: Trace[1495090338]: [13.683906226s] [13.683906226s] END Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.379294 4837 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 11 17:31:08 crc kubenswrapper[4837]: E0111 17:31:08.384527 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.385183 4837 trace.go:236] Trace[625873087]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (11-Jan-2026 17:30:55.380) (total time: 13004ms): Jan 11 17:31:08 crc kubenswrapper[4837]: Trace[625873087]: ---"Objects listed" error: 13004ms (17:31:08.384) Jan 11 17:31:08 crc kubenswrapper[4837]: Trace[625873087]: [13.004832168s] [13.004832168s] END Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.385212 4837 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.385962 4837 trace.go:236] Trace[1266369129]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Jan-2026 17:30:56.268) (total time: 12117ms): Jan 11 17:31:08 crc kubenswrapper[4837]: Trace[1266369129]: ---"Objects listed" error: 12116ms (17:31:08.385) Jan 11 17:31:08 crc kubenswrapper[4837]: Trace[1266369129]: [12.117160035s] [12.117160035s] END Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.385999 4837 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 11 17:31:08 crc kubenswrapper[4837]: I0111 17:31:08.387968 4837 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.078383 4837 apiserver.go:52] "Watching apiserver" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.082260 4837 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.082737 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-daemon-pqnst","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-b7lgc","openshift-dns/node-resolver-kcvkg","openshift-multus/multus-additional-cni-plugins-7996w","openshift-multus/multus-v5bf5","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.083116 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.083180 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.083184 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.083543 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.083611 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.083658 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.083761 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.083870 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.083919 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.084060 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.084374 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.084513 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.084704 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.084838 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kcvkg" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.092395 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.092458 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.092836 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093085 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093131 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093264 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093296 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093406 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093429 4837 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093478 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093549 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093559 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093664 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093716 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093622 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.093411 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094040 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094184 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094318 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094444 4837 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094543 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094635 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094868 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.094980 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.095116 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.095234 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.095338 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.095446 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.095537 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.095735 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 
17:31:09.096602 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.106700 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.122810 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.145219 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.155390 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.164384 4837 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.172321 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.185106 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192253 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192290 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192308 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192327 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 11 17:31:09 crc 
kubenswrapper[4837]: I0111 17:31:09.192341 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192394 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192410 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192425 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192454 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192483 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192500 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: 
I0111 17:31:09.192517 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192531 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192552 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192567 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192596 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192588 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod 
"96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192629 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192620 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192645 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192701 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192718 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192734 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192751 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192770 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192782 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192789 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192830 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192845 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192859 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192874 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192888 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192922 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192979 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.192994 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193009 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193024 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: 
I0111 17:31:09.193038 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193054 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193069 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193085 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193142 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" 
(UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193249 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193266 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193273 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193282 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193302 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193317 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193359 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193375 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193414 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193419 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193466 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193488 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193507 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193524 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193544 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193562 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193579 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193595 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193614 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193618 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193630 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193650 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193700 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193718 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193737 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193772 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193788 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193804 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193821 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 
17:31:09.193838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193854 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193869 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193884 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193898 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193914 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193933 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193950 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193999 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194016 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194031 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194048 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194064 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194080 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194094 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194111 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194128 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194143 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194157 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194174 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194188 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194203 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194219 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194236 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194252 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194270 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 11 
17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194286 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194303 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194319 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194337 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194394 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194411 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194426 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194444 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194459 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 
11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194490 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194506 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194521 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194537 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194553 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194568 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194583 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194599 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194631 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194648 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194665 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194692 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194709 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194726 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194757 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194775 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194791 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194808 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194841 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 
17:31:09.194858 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194876 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194895 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194911 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194959 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194976 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194994 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195009 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195026 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195059 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195076 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195109 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195143 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195164 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195204 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195221 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195239 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195278 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195294 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195310 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195326 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195361 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195376 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195392 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195408 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195425 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195443 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195478 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195493 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195510 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195527 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195559 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195577 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195596 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195629 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195646 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195662 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195947 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195981 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196014 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196030 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196047 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196062 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196138 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-log-socket\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196154 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-system-cni-dir\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196169 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-conf-dir\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196186 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-kubelet\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196202 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-daemon-config\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196218 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-multus-certs\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196234 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-var-lib-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196249 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-bin\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196266 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-netns\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-cni-bin\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1452749-ce38-41f8-89dd-4b567f2a3250-ovn-node-metrics-cert\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxrk\" (UniqueName: \"kubernetes.io/projected/46729771-ba0b-4b7c-8245-b2d57acb5a2c-kube-api-access-fhxrk\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-cni-binary-copy\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cf6fa66-290a-4e29-bafc-e60185a22fe2-proxy-tls\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-node-log\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196413 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196429 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns552\" (UniqueName: \"kubernetes.io/projected/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-kube-api-access-ns552\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196465 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196483 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-netns\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196498 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-etc-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196516 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196536 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196553 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-systemd-units\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-script-lib\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196584 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-hostroot\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-etc-kubernetes\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196615 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a574a741-71b2-4d60-9553-3d85d815f4b0-hosts-file\") pod \"node-resolver-kcvkg\" (UID: \"a574a741-71b2-4d60-9553-3d85d815f4b0\") " pod="openshift-dns/node-resolver-kcvkg"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196631 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnsnp\" (UniqueName: \"kubernetes.io/projected/a574a741-71b2-4d60-9553-3d85d815f4b0-kube-api-access-xnsnp\") pod \"node-resolver-kcvkg\" (UID: \"a574a741-71b2-4d60-9553-3d85d815f4b0\") " pod="openshift-dns/node-resolver-kcvkg"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196649 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-system-cni-dir\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196664 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-cni-multus\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196693 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-os-release\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196709 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cf6fa66-290a-4e29-bafc-e60185a22fe2-rootfs\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196741 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-kubelet\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196755 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-systemd\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196771 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-ovn-kubernetes\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196787 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-env-overrides\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196802 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-slash\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-cnibin\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196853 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-socket-dir-parent\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196869 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cf6fa66-290a-4e29-bafc-e60185a22fe2-mcd-auth-proxy-config\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196886 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196902 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f6rv\" (UniqueName: \"kubernetes.io/projected/e1452749-ce38-41f8-89dd-4b567f2a3250-kube-api-access-6f6rv\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-cni-dir\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196952 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196967 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-netd\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196984 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-config\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197003 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197020 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-os-release\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197038 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197057 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197075 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197092 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-ovn\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197108 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cnibin\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197143 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197162 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-k8s-cni-cncf-io\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197180 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197197 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgdql\" (UniqueName: \"kubernetes.io/projected/1cf6fa66-290a-4e29-bafc-e60185a22fe2-kube-api-access-wgdql\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst"
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197244 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197255 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197265 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197275 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197287 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193647 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193812 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.193921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194236 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199686 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194286 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194434 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194645 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194707 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194833 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.194964 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195029 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195129 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195244 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.195592 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196439 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196719 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.196964 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197052 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197141 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197310 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197344 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197514 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197544 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197733 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.197961 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198032 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198195 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198258 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198268 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198724 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198742 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.198981 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199845 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199171 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199211 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199876 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199360 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199428 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199486 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.199611 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.200044 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.200182 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.200284 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.200429 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.200623 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.200791 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.200910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.201030 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.201155 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.201918 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.202090 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.202120 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.202235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.202309 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.202491 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.202624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.202889 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.203321 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.203507 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.203704 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.203876 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.203959 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.204183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.204240 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.204365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.204395 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.204624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.204827 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.205444 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.205725 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.205886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.205960 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.206181 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.206252 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.206283 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.206337 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.206691 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.206744 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:09.70672427 +0000 UTC m=+43.884917056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.206751 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.207038 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.207056 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.208964 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.209355 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.209737 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.209807 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.209981 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.210171 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.210291 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.210390 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.210554 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.210730 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.210885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.210911 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.211037 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.211181 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.211635 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.211750 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.212216 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.212836 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.213098 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:09.712665633 +0000 UTC m=+43.890858339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.212254 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.212457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.212518 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.212586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.213580 4837 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.213829 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.214025 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.214236 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.214857 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.215133 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.215559 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.216021 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.216249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.216571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.216601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.216774 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.216989 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.217399 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.217543 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.217698 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.217873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.218117 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.218262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.218397 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.218518 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.218694 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.219129 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.219292 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.220064 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.221838 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.221685 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.222619 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.222877 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.222914 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.222951 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.222634 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.223219 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.223264 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.223295 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-11 17:31:09.723270215 +0000 UTC m=+43.901462941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.223517 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.223261 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.223749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.223883 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.223635 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.224357 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.224501 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.224619 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.224574 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.224859 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.225030 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.225134 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.225156 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.225269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.225573 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.225965 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.224256 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.237764 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.225490 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.240788 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.240884 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.240981 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.241014 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.241176 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.241435 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.241806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.243116 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.243184 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.243312 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.243944 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.244824 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.244881 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.244980 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:09.744963702 +0000 UTC m=+43.923156408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.245416 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.246045 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.246065 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.246076 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.246121 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-11 17:31:09.74610584 +0000 UTC m=+43.924298546 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.246237 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.248799 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.249261 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.249579 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.250501 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.250511 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.250586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.250843 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.251062 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.253584 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.256279 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.257441 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.258128 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.258301 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.258353 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.259042 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.259391 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.259635 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.261477 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.259638 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.262309 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.263250 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.263437 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.263585 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.264171 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.264728 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.269920 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.270756 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.279419 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.286414 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.286878 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.291806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.293206 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-ovn\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298290 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298306 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cnibin\") pod \"multus-additional-cni-plugins-7996w\" 
(UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-os-release\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298345 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-k8s-cni-cncf-io\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298368 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgdql\" (UniqueName: \"kubernetes.io/projected/1cf6fa66-290a-4e29-bafc-e60185a22fe2-kube-api-access-wgdql\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-ovn\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298419 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-log-socket\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-log-socket\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298427 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cnibin\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-system-cni-dir\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298476 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-conf-dir\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-netns\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298511 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-cni-bin\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-kubelet\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298540 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-daemon-config\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-os-release\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-multus-certs\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298576 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-multus-certs\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-var-lib-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298595 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-bin\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298611 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-cni-binary-copy\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298628 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-conf-dir\") pod 
\"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cf6fa66-290a-4e29-bafc-e60185a22fe2-proxy-tls\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1452749-ce38-41f8-89dd-4b567f2a3250-ovn-node-metrics-cert\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298682 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-cni-bin\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298697 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxrk\" (UniqueName: \"kubernetes.io/projected/46729771-ba0b-4b7c-8245-b2d57acb5a2c-kube-api-access-fhxrk\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-kubelet\") pod \"multus-v5bf5\" (UID: 
\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298713 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-node-log\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298743 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298760 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-etc-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b7lgc\" (UID: 
\"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns552\" (UniqueName: \"kubernetes.io/projected/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-kube-api-access-ns552\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298815 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-netns\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298830 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-script-lib\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298845 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-hostroot\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-etc-kubernetes\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc 
kubenswrapper[4837]: I0111 17:31:09.298874 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-systemd-units\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298889 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnsnp\" (UniqueName: \"kubernetes.io/projected/a574a741-71b2-4d60-9553-3d85d815f4b0-kube-api-access-xnsnp\") pod \"node-resolver-kcvkg\" (UID: \"a574a741-71b2-4d60-9553-3d85d815f4b0\") " pod="openshift-dns/node-resolver-kcvkg" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298905 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a574a741-71b2-4d60-9553-3d85d815f4b0-hosts-file\") pod \"node-resolver-kcvkg\" (UID: \"a574a741-71b2-4d60-9553-3d85d815f4b0\") " pod="openshift-dns/node-resolver-kcvkg" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298921 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-system-cni-dir\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298941 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-cni-multus\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298956 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-systemd\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298970 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-ovn-kubernetes\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-env-overrides\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299002 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-os-release\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299041 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cf6fa66-290a-4e29-bafc-e60185a22fe2-rootfs\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299059 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-kubelet\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299077 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-slash\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299096 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-cnibin\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-socket-dir-parent\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299142 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f6rv\" (UniqueName: 
\"kubernetes.io/projected/e1452749-ce38-41f8-89dd-4b567f2a3250-kube-api-access-6f6rv\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299159 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-cni-dir\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cf6fa66-290a-4e29-bafc-e60185a22fe2-mcd-auth-proxy-config\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-cni-binary-copy\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299226 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-config\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299240 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-k8s-cni-cncf-io\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-netd\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-daemon-config\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298612 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-system-cni-dir\") pod \"multus-v5bf5\" (UID: 
\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.298650 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-run-netns\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299338 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-node-log\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299351 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-system-cni-dir\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-host-var-lib-cni-multus\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299403 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-systemd\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 
17:31:09.299424 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-ovn-kubernetes\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299621 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-var-lib-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299638 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-bin\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-cni-dir\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299714 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46729771-ba0b-4b7c-8245-b2d57acb5a2c-os-release\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.299975 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-env-overrides\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cf6fa66-290a-4e29-bafc-e60185a22fe2-rootfs\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300304 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-kubelet\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300323 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-slash\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300350 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-cnibin\") 
pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-multus-socket-dir-parent\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300395 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300545 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cf6fa66-290a-4e29-bafc-e60185a22fe2-mcd-auth-proxy-config\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300923 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-script-lib\") pod \"ovnkube-node-b7lgc\" (UID: 
\"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300931 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-config\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300958 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-netd\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a574a741-71b2-4d60-9553-3d85d815f4b0-hosts-file\") pod \"node-resolver-kcvkg\" (UID: \"a574a741-71b2-4d60-9553-3d85d815f4b0\") " pod="openshift-dns/node-resolver-kcvkg" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300980 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-etc-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300985 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46729771-ba0b-4b7c-8245-b2d57acb5a2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 
11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301003 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-netns\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301024 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-systemd-units\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.300997 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-etc-kubernetes\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301048 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-hostroot\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301063 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-openvswitch\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301252 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301267 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301280 4837 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301290 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301332 4837 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301343 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301353 4837 
reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301362 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301431 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301453 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301466 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301479 4837 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301489 4837 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301499 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301509 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301520 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301531 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301542 4837 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301552 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301561 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301570 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301580 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301590 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301599 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301610 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301620 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301629 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301639 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301649 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301660 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301686 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301697 4837 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301707 4837 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301718 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301729 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301740 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301750 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301760 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301770 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301779 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301788 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301798 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301808 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301817 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301827 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301837 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301846 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301855 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301864 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301873 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301883 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301893 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301903 4837 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301912 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301921 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301931 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301941 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301950 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301959 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301968 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301978 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301987 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.301996 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302004 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302015 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302027 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302037 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302046 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302055 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302065 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302075 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302084 4837 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302094 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302105 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302114 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302123 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302132 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302141 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302151 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302160 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302169 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302178 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302187 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302196 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302205 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302214 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302224 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302233 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302243 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302253 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302262 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302272 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302282 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302291 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302300 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302310 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302322 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302331 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302341 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302350 4837 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302360 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302370 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302379 4837 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302389 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302398 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302406 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302415 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302424 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302433 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302442 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302451 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302461 4837 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302470 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302480 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302489 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302507 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302516 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302525 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302534 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302543 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302552 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302561 4837 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302570 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302579 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302589 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302599 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302608 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302619 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302629 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302640 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302649 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302659 4837 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302669 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302694 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302704 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302713 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302722 4837 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302730 4837 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302740 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302750 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302760 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302769 4837 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302778 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302788 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302797 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302813 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302822 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302832 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302841 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302850 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302859 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302869 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302881 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302889 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302901 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302911 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302920 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302930 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302939 4837 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\")
on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302947 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302958 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302968 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302977 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302986 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.302996 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303005 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303014 4837 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303023 4837 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303034 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303043 4837 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303052 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303061 4837 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303073 4837 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303083 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303092 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303101 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303111 4837 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303119 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303127 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303130 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cf6fa66-290a-4e29-bafc-e60185a22fe2-proxy-tls\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303136 4837 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303172 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303182 4837 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303191 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303200 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303210 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303219 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303228 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303236 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303244 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.303876 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1452749-ce38-41f8-89dd-4b567f2a3250-ovn-node-metrics-cert\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.314846 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgdql\" (UniqueName: \"kubernetes.io/projected/1cf6fa66-290a-4e29-bafc-e60185a22fe2-kube-api-access-wgdql\") pod \"machine-config-daemon-pqnst\" (UID: \"1cf6fa66-290a-4e29-bafc-e60185a22fe2\") " pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.316643 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxrk\" (UniqueName: \"kubernetes.io/projected/46729771-ba0b-4b7c-8245-b2d57acb5a2c-kube-api-access-fhxrk\") pod \"multus-additional-cni-plugins-7996w\" (UID: \"46729771-ba0b-4b7c-8245-b2d57acb5a2c\") " pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.317210 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xnsnp\" (UniqueName: \"kubernetes.io/projected/a574a741-71b2-4d60-9553-3d85d815f4b0-kube-api-access-xnsnp\") pod \"node-resolver-kcvkg\" (UID: \"a574a741-71b2-4d60-9553-3d85d815f4b0\") " pod="openshift-dns/node-resolver-kcvkg" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.317850 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns552\" (UniqueName: \"kubernetes.io/projected/78cc7c3f-09f5-4200-a647-8fa4e9b2aae5-kube-api-access-ns552\") pod \"multus-v5bf5\" (UID: \"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\") " pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.322851 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f6rv\" (UniqueName: \"kubernetes.io/projected/e1452749-ce38-41f8-89dd-4b567f2a3250-kube-api-access-6f6rv\") pod \"ovnkube-node-b7lgc\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.403954 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.413231 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 11 17:31:09 crc kubenswrapper[4837]: W0111 17:31:09.426704 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-1ab7706cbdd2ee524581aa63fa224233c72d2b9dee907b5fd705f8575c353a1e WatchSource:0}: Error finding container 1ab7706cbdd2ee524581aa63fa224233c72d2b9dee907b5fd705f8575c353a1e: Status 404 returned error can't find the container with id 1ab7706cbdd2ee524581aa63fa224233c72d2b9dee907b5fd705f8575c353a1e Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.427006 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.435564 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1ab7706cbdd2ee524581aa63fa224233c72d2b9dee907b5fd705f8575c353a1e"} Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.436610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"790be83ccf3197cb9232b80047378468877211ac004c7ed9751eb84ea94dab06"} Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.443002 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:31:09 crc kubenswrapper[4837]: W0111 17:31:09.448786 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-aaa925b4986bb3f2a88d72f7624f9aa55b04c3354ae5ba545e89ae9044dce0ff WatchSource:0}: Error finding container aaa925b4986bb3f2a88d72f7624f9aa55b04c3354ae5ba545e89ae9044dce0ff: Status 404 returned error can't find the container with id aaa925b4986bb3f2a88d72f7624f9aa55b04c3354ae5ba545e89ae9044dce0ff Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.452934 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v5bf5" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.460327 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:09 crc kubenswrapper[4837]: W0111 17:31:09.463405 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf6fa66_290a_4e29_bafc_e60185a22fe2.slice/crio-e521a1db0cbaa129b7dcfb9861b46017b1809bae798187b74b6afb64f02a6858 WatchSource:0}: Error finding container e521a1db0cbaa129b7dcfb9861b46017b1809bae798187b74b6afb64f02a6858: Status 404 returned error can't find the container with id e521a1db0cbaa129b7dcfb9861b46017b1809bae798187b74b6afb64f02a6858 Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.467822 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7996w" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.473208 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kcvkg" Jan 11 17:31:09 crc kubenswrapper[4837]: W0111 17:31:09.483709 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1452749_ce38_41f8_89dd_4b567f2a3250.slice/crio-6736d3517f7eec45b1843f7580da009352a6d13f6cc3bf67f0047b8b967eac72 WatchSource:0}: Error finding container 6736d3517f7eec45b1843f7580da009352a6d13f6cc3bf67f0047b8b967eac72: Status 404 returned error can't find the container with id 6736d3517f7eec45b1843f7580da009352a6d13f6cc3bf67f0047b8b967eac72 Jan 11 17:31:09 crc kubenswrapper[4837]: W0111 17:31:09.488227 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78cc7c3f_09f5_4200_a647_8fa4e9b2aae5.slice/crio-7bb8bd4a15d6d53a4b07ff4a9a281ed517b6656458d4bd396c2b05432fd28b43 WatchSource:0}: Error finding container 7bb8bd4a15d6d53a4b07ff4a9a281ed517b6656458d4bd396c2b05432fd28b43: Status 404 returned error can't find the container with id 7bb8bd4a15d6d53a4b07ff4a9a281ed517b6656458d4bd396c2b05432fd28b43 Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.707176 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.707372 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:10.707348083 +0000 UTC m=+44.885540789 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.808764 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.808811 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.808831 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:09 crc kubenswrapper[4837]: I0111 17:31:09.808854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.808969 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.808971 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809022 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809038 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809105 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809047 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:10.809030841 +0000 UTC m=+44.987223547 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809139 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809156 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:10.809146093 +0000 UTC m=+44.987338799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809198 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:10.809180694 +0000 UTC m=+44.987373430 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.808984 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809230 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:09 crc kubenswrapper[4837]: E0111 17:31:09.809286 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:10.809271497 +0000 UTC m=+44.987464243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.364157 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.364386 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.364531 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.364765 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.372235 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.373601 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.376641 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.378516 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.380601 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.381859 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.383159 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.385226 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.386542 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.388795 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.390217 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.392387 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.393544 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.394751 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.397168 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.398801 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.401065 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.402017 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.403213 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.405316 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.406449 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.408317 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.409254 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.411727 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.412716 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.414016 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.416107 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.417106 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.419041 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.420255 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.422484 4837 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.422777 4837 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.426589 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.428809 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.429874 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.432353 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.433840 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.435112 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.436661 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.441231 4837 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.442791 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.445966 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.454897 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.457466 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.460131 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.461621 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.465152 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.467286 4837 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.469434 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.471122 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.473632 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.475372 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.479150 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.481125 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.482569 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" 
event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerStarted","Data":"69ae5d64e947eab02b974d10408eaab2d46c08a193a3f22bfa31bcac81e92db8"} Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.482898 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v5bf5" event={"ID":"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5","Type":"ContainerStarted","Data":"7bb8bd4a15d6d53a4b07ff4a9a281ed517b6656458d4bd396c2b05432fd28b43"} Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.483020 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"e521a1db0cbaa129b7dcfb9861b46017b1809bae798187b74b6afb64f02a6858"} Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.483112 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"6736d3517f7eec45b1843f7580da009352a6d13f6cc3bf67f0047b8b967eac72"} Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.483189 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"aaa925b4986bb3f2a88d72f7624f9aa55b04c3354ae5ba545e89ae9044dce0ff"} Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.483264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kcvkg" event={"ID":"a574a741-71b2-4d60-9553-3d85d815f4b0","Type":"ContainerStarted","Data":"8eb59b9c34127e9fe93f67b4feb88774e28f737c4062dc0b232c521b86228b7b"} Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.503816 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zd2gg"] Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.504639 4837 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.507797 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.507838 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.508080 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.508618 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.529186 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.540850 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.553332 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.567066 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.579021 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.597423 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.607204 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.615976 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.616091 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-serviceca\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.616114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq6v5\" (UniqueName: 
\"kubernetes.io/projected/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-kube-api-access-gq6v5\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.616181 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-host\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.624566 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.633867 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.648054 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.660525 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.688176 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.695345 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.698739 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 
17:31:10.707098 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.717319 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.717467 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-host\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.717502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-serviceca\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 
17:31:10.717521 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq6v5\" (UniqueName: \"kubernetes.io/projected/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-kube-api-access-gq6v5\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.717785 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-host\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.718071 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:12.718030687 +0000 UTC m=+46.896223433 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.726383 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.735533 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.736251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq6v5\" (UniqueName: \"kubernetes.io/projected/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-kube-api-access-gq6v5\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.746161 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.759892 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.771430 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.781874 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.790497 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.801927 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.812266 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.819447 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.819491 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.819693 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.819879 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:12.819783539 +0000 UTC m=+46.997976315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.820273 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.820298 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.820309 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.820359 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:12.820343433 +0000 UTC m=+46.998536139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.821307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.821438 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.821488 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.821572 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.821635 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.821648 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.821614 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:12.821586514 +0000 UTC m=+46.999779270 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:10 crc kubenswrapper[4837]: E0111 17:31:10.821777 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-11 17:31:12.821752108 +0000 UTC m=+46.999944894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.823778 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:10 crc kubenswrapper[4837]: I0111 17:31:10.839222 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 11 17:31:11 crc kubenswrapper[4837]: I0111 17:31:11.363075 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:11 crc kubenswrapper[4837]: E0111 17:31:11.363260 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:12 crc kubenswrapper[4837]: I0111 17:31:12.363568 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:12 crc kubenswrapper[4837]: I0111 17:31:12.363760 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.363923 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.364039 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:12 crc kubenswrapper[4837]: I0111 17:31:12.744083 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.744293 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:16.744254476 +0000 UTC m=+50.922447222 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:12 crc kubenswrapper[4837]: I0111 17:31:12.845799 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:12 crc kubenswrapper[4837]: I0111 17:31:12.845887 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:12 crc kubenswrapper[4837]: I0111 17:31:12.845947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846137 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846197 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846213 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846244 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846248 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:12 crc kubenswrapper[4837]: I0111 17:31:12.846006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846223 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846266 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846141 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846339 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:16.846313323 +0000 UTC m=+51.024506059 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846573 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:16.846541179 +0000 UTC m=+51.024733925 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846606 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:16.84658969 +0000 UTC m=+51.024782556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:12 crc kubenswrapper[4837]: E0111 17:31:12.846636 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:16.846620981 +0000 UTC m=+51.024813827 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:13 crc kubenswrapper[4837]: I0111 17:31:13.047286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfaa7eab-7649-4a7d-8e95-aab12da5f86d-serviceca\") pod \"node-ca-zd2gg\" (UID: \"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\") " pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:13 crc kubenswrapper[4837]: I0111 17:31:13.228513 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zd2gg" Jan 11 17:31:13 crc kubenswrapper[4837]: W0111 17:31:13.247826 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfaa7eab_7649_4a7d_8e95_aab12da5f86d.slice/crio-5f66394a6abb28512fc9c530411c47b725ee1b492ad6ece7a13888c8e79b5231 WatchSource:0}: Error finding container 5f66394a6abb28512fc9c530411c47b725ee1b492ad6ece7a13888c8e79b5231: Status 404 returned error can't find the container with id 5f66394a6abb28512fc9c530411c47b725ee1b492ad6ece7a13888c8e79b5231 Jan 11 17:31:13 crc kubenswrapper[4837]: I0111 17:31:13.363308 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:13 crc kubenswrapper[4837]: E0111 17:31:13.363504 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:13 crc kubenswrapper[4837]: I0111 17:31:13.477896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zd2gg" event={"ID":"cfaa7eab-7649-4a7d-8e95-aab12da5f86d","Type":"ContainerStarted","Data":"5f66394a6abb28512fc9c530411c47b725ee1b492ad6ece7a13888c8e79b5231"} Jan 11 17:31:14 crc kubenswrapper[4837]: I0111 17:31:14.363312 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:14 crc kubenswrapper[4837]: I0111 17:31:14.363376 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:14 crc kubenswrapper[4837]: E0111 17:31:14.363531 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:14 crc kubenswrapper[4837]: E0111 17:31:14.363658 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:14 crc kubenswrapper[4837]: I0111 17:31:14.483511 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f21bc9ccad340e4d66f5f81f4d3917c28326d9633319d8252a9faebd954cf921"} Jan 11 17:31:14 crc kubenswrapper[4837]: I0111 17:31:14.571186 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 11 17:31:14 crc kubenswrapper[4837]: I0111 17:31:14.571264 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.283815 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.283899 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 11 17:31:15 crc 
kubenswrapper[4837]: I0111 17:31:15.364047 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:15 crc kubenswrapper[4837]: E0111 17:31:15.364353 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.386513 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.389314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.389375 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.390123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.390358 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.400398 4837 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.400915 4837 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.402668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.402738 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.402755 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.402778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.402795 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: E0111 17:31:15.430186 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"998e7fc1-b4f8-46e4-96eb-3a9f319ce8e1\\\",\\\"systemUUID\\\":\\\"860923a9-1393-406f-8d70-e9e11a3b51a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:15Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.434980 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.435036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.435055 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.435079 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.435097 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: E0111 17:31:15.456582 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"998e7fc1-b4f8-46e4-96eb-3a9f319ce8e1\\\",\\\"systemUUID\\\":\\\"860923a9-1393-406f-8d70-e9e11a3b51a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:15Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.460746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.460801 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.460821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.460843 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.460860 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: E0111 17:31:15.480450 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"998e7fc1-b4f8-46e4-96eb-3a9f319ce8e1\\\",\\\"systemUUID\\\":\\\"860923a9-1393-406f-8d70-e9e11a3b51a7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:15Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.485071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.485163 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.485181 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.485239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.485257 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.488372 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"ef1d1b5ff926a1f2f0f357177d19214a1c92fbf76f445fb1a767d0d17cd1b4cb"} Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.490544 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a2c723a71bf258ca0d2cc93b21a3cb0aa257b26d456401a12ea75b342b5cc4a3"} Jan 11 17:31:15 crc kubenswrapper[4837]: E0111 17:31:15.506807 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"998e7fc1-b4f8-46e4-96eb-3a9f319ce8e1\\\",\\\"systemUUID\\\":\\\"860923a9-1393-406f-8d70-e9e11a3b51a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:15Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.511922 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.511972 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.511992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.512014 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.512031 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: E0111 17:31:15.531780 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"998e7fc1-b4f8-46e4-96eb-3a9f319ce8e1\\\",\\\"systemUUID\\\":\\\"860923a9-1393-406f-8d70-e9e11a3b51a7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:15Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:15 crc kubenswrapper[4837]: E0111 17:31:15.531999 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.534174 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.534225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.534242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.534263 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.534280 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.637954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.638013 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.638030 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.638055 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.638072 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.741521 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.741758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.741816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.741960 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.742017 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.844754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.845059 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.845145 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.845288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.845372 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.948969 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.949229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.949240 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.949254 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.949265 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:15Z","lastTransitionTime":"2026-01-11T17:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:15 crc kubenswrapper[4837]: I0111 17:31:15.985749 4837 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.051467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.051512 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.051522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.051536 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.051547 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.153950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.153997 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.154010 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.154030 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.154046 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.256595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.256635 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.256646 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.256662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.256699 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.359067 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.359108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.359118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.359133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.359143 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.366180 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.366301 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.366402 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.366480 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.377508 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.392864 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.403017 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.413891 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.424917 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62ccdf63-9fbb-4a5c-825b-aefdb1153bfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0111 17:30:32.298484 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0111 17:30:32.299973 1 observer_polling.go:159] Starting file observer\\\\nI0111 17:30:32.306834 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0111 17:30:32.308440 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0111 17:31:02.868341 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context deadline 
exceeded\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:30Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.434659 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.448925 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.460558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.460587 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.460596 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.460608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.460617 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.464762 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.478610 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.494147 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9d45ed917f3e9313641645bd7d23d83bb94083cc3ed630249101362de9325827"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.495136 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerStarted","Data":"5f7aabf84755ea73f102e79b2c39d54635582f0a3cf82ccaa8b9f26c6adcb173"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.495933 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v5bf5" event={"ID":"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5","Type":"ContainerStarted","Data":"66c1f7c32c3134fb7c0af16ffb4d62add39d4be549cf3bc7a575e94dd9352cc2"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.497044 4837 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.497483 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.498639 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" exitCode=255 Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.498697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.498729 4837 scope.go:117] "RemoveContainer" containerID="9f228eaa8376ca17f605b85e71a21a46e226d720350baabc72ef9ef6d468cd0e" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.500509 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="c84b06f66f7c241b37748895091d23bd71f1ca684c1fe1ef6116b709873ea779" exitCode=0 Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.500550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"c84b06f66f7c241b37748895091d23bd71f1ca684c1fe1ef6116b709873ea779"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.502263 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.502459 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kcvkg" event={"ID":"a574a741-71b2-4d60-9553-3d85d815f4b0","Type":"ContainerStarted","Data":"b8a5a6666f6458eb724b8347642ead2c550ab24da78edfc3fd7ecddaa0398400"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.507253 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.507567 4837 scope.go:117] "RemoveContainer" containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.507783 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.515176 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.526743 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.538782 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.551519 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.562871 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.563000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.563045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.563063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.563087 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.563107 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.578826 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.595148 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.610321 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.635724 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.645505 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.653916 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.665111 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62ccdf63-9fbb-4a5c-825b-aefdb1153bfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0111 17:30:32.298484 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0111 17:30:32.299973 1 observer_polling.go:159] Starting file observer\\\\nI0111 17:30:32.306834 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0111 17:30:32.308440 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0111 17:31:02.868341 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context deadline 
exceeded\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:30Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.665398 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.665444 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.665461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.665484 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.665502 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.675044 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.692430 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.703165 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.716931 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.738332 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46acf340-e77e-47f8-ba25-95b9b5870af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f228eaa8376ca17f605b85e71a21a46e226d720350baabc72ef9ef6d468cd0e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:02Z\\\",\\\"message\\\":\\\"W0111 17:30:38.127484 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0111 17:30:38.128018 1 crypto.go:601] Generating new CA for check-endpoints-signer@1768152638 cert, and key in /tmp/serving-cert-969611356/serving-signer.crt, /tmp/serving-cert-969611356/serving-signer.key\\\\nI0111 17:30:38.511822 1 observer_polling.go:159] Starting file observer\\\\nW0111 17:30:48.515539 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0111 17:30:48.515925 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:30:49.350573 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-969611356/tls.crt::/tmp/serving-cert-969611356/tls.key\\\\\\\"\\\\nF0111 17:31:00.709066 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0111 17:31:09.479691 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0111 17:31:09.479821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:31:09.480410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4110613780/tls.crt::/tmp/serving-cert-4110613780/tls.key\\\\\\\"\\\\nI0111 17:31:09.725824 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0111 17:31:10.438563 1 maxinflight.go:139] \\\\\\\"Initialized 
nonMutatingChan\\\\\\\" len=400\\\\nI0111 17:31:10.438624 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0111 17:31:10.438713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0111 17:31:10.438732 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0111 17:31:10.449082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0111 17:31:10.449135 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0111 17:31:10.449169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0111 17:31:10.449179 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0111 17:31:10.449187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0111 17:31:10.449256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0111 17:31:10.451583 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:16Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.767891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.768218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.768235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.768260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.768276 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.784350 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.784543 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:24.784515655 +0000 UTC m=+58.962708371 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.871042 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.871107 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.871124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.871151 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: 
I0111 17:31:16.871168 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.885647 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.885721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.885750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.885783 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.885912 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.885924 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.885950 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.885962 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.885983 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.886015 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:24.885992578 +0000 UTC m=+59.064185374 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.886020 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.886052 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:24.886027829 +0000 UTC m=+59.064220575 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.885933 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.886082 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-11 17:31:24.88606755 +0000 UTC m=+59.064260296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.886089 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:16 crc kubenswrapper[4837]: E0111 17:31:16.886136 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:24.886119031 +0000 UTC m=+59.064311767 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.973507 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.973538 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.973550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.973566 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:16 crc kubenswrapper[4837]: I0111 17:31:16.973577 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:16Z","lastTransitionTime":"2026-01-11T17:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.076378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.076447 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.076465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.076486 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.076504 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.179276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.179329 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.179346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.179368 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.179385 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.282381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.282434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.282451 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.282474 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.282491 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.363560 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:17 crc kubenswrapper[4837]: E0111 17:31:17.363796 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.385735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.385799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.385818 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.385842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.385858 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.488383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.488440 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.488457 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.488480 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.488497 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.505723 4837 scope.go:117] "RemoveContainer" containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" Jan 11 17:31:17 crc kubenswrapper[4837]: E0111 17:31:17.506007 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.523934 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.541343 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.553602 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.566858 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.581879 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46acf340-e77e-47f8-ba25-95b9b5870af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0111 17:31:09.479691 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0111 17:31:09.479821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:31:09.480410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4110613780/tls.crt::/tmp/serving-cert-4110613780/tls.key\\\\\\\"\\\\nI0111 17:31:09.725824 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0111 17:31:10.438563 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0111 17:31:10.438624 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0111 17:31:10.438713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0111 17:31:10.438732 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0111 17:31:10.449082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0111 17:31:10.449135 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0111 17:31:10.449169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0111 17:31:10.449179 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0111 17:31:10.449187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0111 17:31:10.449256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0111 17:31:10.451583 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:31:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.590909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.590967 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.590991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.591013 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.591029 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.596570 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.607413 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.621955 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.635285 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.650515 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.671328 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.685779 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.693700 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.693753 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.693764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.693781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.693815 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.701001 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.717743 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62ccdf63-9fbb-4a5c-825b-aefdb1153bfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0111 17:30:32.298484 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0111 17:30:32.299973 1 observer_polling.go:159] Starting file observer\\\\nI0111 17:30:32.306834 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0111 17:30:32.308440 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0111 17:31:02.868341 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context deadline 
exceeded\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:30Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:17Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.796426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.796485 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.796506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.796535 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.796554 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.899593 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.899656 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.899708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.899737 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:17 crc kubenswrapper[4837]: I0111 17:31:17.899757 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:17Z","lastTransitionTime":"2026-01-11T17:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.003156 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.003408 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.003572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.003737 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.003859 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.037843 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.063337 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.085185 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.104004 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.106330 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.106386 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.106404 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.106428 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.106448 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.125270 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.145607 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.165398 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.188461 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62ccdf63-9fbb-4a5c-825b-aefdb1153bfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0111 17:30:32.298484 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0111 17:30:32.299973 1 observer_polling.go:159] Starting file observer\\\\nI0111 17:30:32.306834 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0111 17:30:32.308440 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0111 17:31:02.868341 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context deadline 
exceeded\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:30Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.209725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.209768 4837 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.209785 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.209809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.209827 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.212809 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.245367 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.265747 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.282772 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.295568 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.307202 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46acf340-e77e-47f8-ba25-95b9b5870af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0111 17:31:09.479691 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0111 17:31:09.479821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:31:09.480410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4110613780/tls.crt::/tmp/serving-cert-4110613780/tls.key\\\\\\\"\\\\nI0111 17:31:09.725824 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0111 17:31:10.438563 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0111 17:31:10.438624 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0111 17:31:10.438713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0111 17:31:10.438732 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0111 17:31:10.449082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0111 17:31:10.449135 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0111 17:31:10.449169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0111 17:31:10.449179 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0111 17:31:10.449187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0111 17:31:10.449256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0111 17:31:10.451583 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:31:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.312325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.312382 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.312405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.312432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.312457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.322368 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:18Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.363845 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.363895 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:18 crc kubenswrapper[4837]: E0111 17:31:18.363958 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:18 crc kubenswrapper[4837]: E0111 17:31:18.364066 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.414647 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.414747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.414770 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.414799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.414819 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.516769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.516849 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.516877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.516908 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.516931 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.619560 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.619937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.620086 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.620239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.620377 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.724050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.724105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.724121 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.724144 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.724160 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.827247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.827299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.827321 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.827346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.827363 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.930073 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.930491 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.930645 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.930826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:18 crc kubenswrapper[4837]: I0111 17:31:18.931002 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:18Z","lastTransitionTime":"2026-01-11T17:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.034237 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.034268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.034276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.034291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.034300 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.137384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.137455 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.137472 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.137497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.137515 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.240206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.240284 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.240301 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.240326 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.240345 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.343620 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.343743 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.343763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.343821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.343841 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.363357 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:19 crc kubenswrapper[4837]: E0111 17:31:19.363538 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.450062 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.450119 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.450137 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.450161 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.450179 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.553500 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.553544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.553562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.553585 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.553601 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.656731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.656778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.656795 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.656817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.656834 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.760164 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.760240 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.760260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.760285 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.760305 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.863618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.863720 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.863743 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.863771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.863793 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.967485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.967877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.968025 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.968147 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:19 crc kubenswrapper[4837]: I0111 17:31:19.968287 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:19Z","lastTransitionTime":"2026-01-11T17:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.071587 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.071641 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.071657 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.071714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.071732 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.174429 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.174487 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.174507 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.174537 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.174561 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.278602 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.278668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.278722 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.278747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.278764 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.323828 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.334128 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.342161 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.363855 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.363918 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:20 crc kubenswrapper[4837]: E0111 17:31:20.364097 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:20 crc kubenswrapper[4837]: E0111 17:31:20.364377 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.366719 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.381075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.381136 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.381159 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.381191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.381215 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.387745 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.402295 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.423506 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46acf340-e77e-47f8-ba25-95b9b5870af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0111 17:31:09.479691 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0111 17:31:09.479821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:31:09.480410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4110613780/tls.crt::/tmp/serving-cert-4110613780/tls.key\\\\\\\"\\\\nI0111 17:31:09.725824 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0111 17:31:10.438563 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0111 17:31:10.438624 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0111 17:31:10.438713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0111 17:31:10.438732 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0111 17:31:10.449082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0111 17:31:10.449135 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0111 17:31:10.449169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0111 17:31:10.449179 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0111 17:31:10.449187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0111 17:31:10.449256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0111 17:31:10.451583 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:31:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.441762 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.456525 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.479141 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.484507 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.484569 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.484588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.484614 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.484631 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.501667 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.516178 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zd2gg" event={"ID":"cfaa7eab-7649-4a7d-8e95-aab12da5f86d","Type":"ContainerStarted","Data":"eec8970b0bbf95820938ebc8e014d42feecbb96e0d63a96923d38cc3ba24bd6c"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.519218 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.543819 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.555329 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.569398 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.588018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.588097 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.588122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 
17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.588152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.588174 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.593905 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62ccdf63-9fbb-4a5c-825b-aefdb1153bfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0111 17:30:32.298484 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0111 17:30:32.299973 1 observer_polling.go:159] Starting file observer\\\\nI0111 17:30:32.306834 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0111 17:30:32.308440 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0111 17:31:02.868341 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context deadline exceeded\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:30Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:20Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.691246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.691315 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.691334 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.691431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.691453 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.793823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.793885 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.793903 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.793933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.793956 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.900310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.900364 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.900382 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.900408 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:20 crc kubenswrapper[4837]: I0111 17:31:20.900428 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:20Z","lastTransitionTime":"2026-01-11T17:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.002624 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.002662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.002693 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.002711 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.002721 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.106011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.106069 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.106086 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.106111 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.106129 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.208917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.208954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.208963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.208979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.208988 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.311778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.311818 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.311827 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.311841 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.311853 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.332045 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7"] Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.332758 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.334964 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.335365 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.357868 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.363430 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:21 crc kubenswrapper[4837]: E0111 17:31:21.363606 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.373495 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.396117 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.414078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.414122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.414136 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.414156 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.414169 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.417615 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.432648 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.432740 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.432764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.432813 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbctq\" (UniqueName: \"kubernetes.io/projected/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-kube-api-access-sbctq\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.438176 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.467232 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.484128 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.500243 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.517051 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.517086 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.517099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 
17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.517119 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.517130 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.519169 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62ccdf63-9fbb-4a5c-825b-aefdb1153bfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0111 17:30:32.298484 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0111 17:30:32.299973 1 observer_polling.go:159] Starting file observer\\\\nI0111 17:30:32.306834 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0111 17:30:32.308440 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0111 17:31:02.868341 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context deadline exceeded\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:30Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.533980 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.534204 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.534401 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbctq\" (UniqueName: \"kubernetes.io/projected/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-kube-api-access-sbctq\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.534578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.535151 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.535951 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.538464 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa05a690-1797-4956-a0a5-bec02527f1ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1570357568efb8bd28e5d0705440f75cbdee0d1872382250316b2ced45b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c9dacc05eec80f9db657621ea0185c5aa06416af8f96a0a0cb43304b6cd17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1454dce2acfc65d7ef5ab03957c74ea3fba47fe5ac62a9e8a4599ac7b6a2c217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3e5dfb1cb302cd3b72ee88a864dc0e47a098eb7775d2fe93f100207bb910d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ebe3e5dfb1cb302cd3b72ee88a864dc0e47a098eb7775d2fe93f100207bb910d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.545245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.560390 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.563306 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbctq\" (UniqueName: \"kubernetes.io/projected/be7e19df-ab8a-4616-99b6-3e13a1fa0e4a-kube-api-access-sbctq\") pod \"ovnkube-control-plane-749d76644c-77cm7\" (UID: \"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.579271 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbctq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbctq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-77cm7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.599142 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.617364 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.621343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.621546 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.621712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.621859 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.621997 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.636586 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.652324 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.658226 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46acf340-e77e-47f8-ba25-95b9b5870af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0111 17:31:09.479691 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0111 17:31:09.479821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:31:09.480410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4110613780/tls.crt::/tmp/serving-cert-4110613780/tls.key\\\\\\\"\\\\nI0111 17:31:09.725824 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0111 17:31:10.438563 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0111 17:31:10.438624 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0111 17:31:10.438713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0111 17:31:10.438732 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0111 17:31:10.449082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0111 17:31:10.449135 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0111 17:31:10.449169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0111 17:31:10.449179 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0111 17:31:10.449187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0111 17:31:10.449256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0111 17:31:10.451583 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:31:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:21Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.724808 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.724895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.724919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.724945 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.724962 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.827620 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.827700 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.827718 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.827740 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.827754 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.930429 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.930504 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.930532 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.930561 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:21 crc kubenswrapper[4837]: I0111 17:31:21.930580 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:21Z","lastTransitionTime":"2026-01-11T17:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.033390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.033475 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.033502 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.033533 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.033555 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.143550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.143603 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.143624 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.143653 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.143729 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.246230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.246288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.246306 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.246328 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.246345 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.349210 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.349457 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.349546 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.349644 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.349755 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.363920 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.364084 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:22 crc kubenswrapper[4837]: E0111 17:31:22.364402 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:22 crc kubenswrapper[4837]: E0111 17:31:22.364404 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.453198 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.453262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.453280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.453305 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.453345 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.556392 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.556476 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.556502 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.556534 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.556558 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.659649 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.659730 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.659747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.659771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.659789 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.763318 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.763417 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.763439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.763464 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.763482 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.839746 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-f2l24"] Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.840302 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:22 crc kubenswrapper[4837]: E0111 17:31:22.840386 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.864950 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46acf340-e77e-47f8-ba25-95b9b5870af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0111 17:31:09.479691 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0111 17:31:09.479821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:31:09.480410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4110613780/tls.crt::/tmp/serving-cert-4110613780/tls.key\\\\\\\"\\\\nI0111 17:31:09.725824 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0111 17:31:10.438563 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0111 17:31:10.438624 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0111 17:31:10.438713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0111 17:31:10.438732 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0111 17:31:10.449082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0111 17:31:10.449135 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0111 17:31:10.449169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0111 17:31:10.449179 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0111 17:31:10.449187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0111 17:31:10.449256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0111 17:31:10.451583 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:31:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:22Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.866001 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.866050 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.866061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.866082 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.866095 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.885352 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:22Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.906455 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2l24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c57mc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c57mc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2l24\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:22Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.922777 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:22Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.944465 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:22Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.951457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c57mc\" (UniqueName: 
\"kubernetes.io/projected/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-kube-api-access-c57mc\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.951731 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.962149 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cf6fa66-290a-4e29-bafc-e60185a22fe2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wgdql\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pqnst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:22Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.971033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.971095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.971117 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.971145 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.971169 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:22Z","lastTransitionTime":"2026-01-11T17:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:22 crc kubenswrapper[4837]: I0111 17:31:22.988748 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46729771-ba0b-4b7c-8245-b2d57acb5a2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhxrk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7996w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:22Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:22 crc kubenswrapper[4837]: W0111 17:31:22.989651 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe7e19df_ab8a_4616_99b6_3e13a1fa0e4a.slice/crio-a0881a131d952480e77f9effad8b5e5403a34a0d53f1f0034872e27da0124a34 WatchSource:0}: Error finding container a0881a131d952480e77f9effad8b5e5403a34a0d53f1f0034872e27da0124a34: Status 404 returned error can't find the container with id a0881a131d952480e77f9effad8b5e5403a34a0d53f1f0034872e27da0124a34 Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.002931 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa05a690-1797-4956-a0a5-bec02527f1ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1570357568efb8bd28e5d0705440f75cbdee0d1872382250316b2ced45b7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c9dacc05eec80f9db657621ea0185c5aa06416af8f96a0a0cb43304b6cd17c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1454dce2acfc65d7ef5ab03957c74ea3fba47fe5ac62a9e8a4599ac7b6a2c217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3e5dfb1cb302cd3b72ee88a864dc0e47a098eb7775d2fe93f100207bb910d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ebe3e5dfb1cb302cd3b72ee88a864dc0e47a098eb7775d2fe93f100207bb910d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.020090 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.042885 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1452749-ce38-41f8-89dd-4b567f2a3250\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f6rv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b7lgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.052114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.052186 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c57mc\" (UniqueName: \"kubernetes.io/projected/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-kube-api-access-c57mc\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:23 crc kubenswrapper[4837]: E0111 17:31:23.052339 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:23 crc kubenswrapper[4837]: E0111 17:31:23.052458 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs podName:b0d76b5a-6ea4-4508-ac4b-0f74711d7f68 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:23.552429069 +0000 UTC m=+57.730621805 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs") pod "network-metrics-daemon-f2l24" (UID: "b0d76b5a-6ea4-4508-ac4b-0f74711d7f68") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.056776 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kcvkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a574a741-71b2-4d60-9553-3d85d815f4b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xnsnp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kcvkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.067725 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c57mc\" (UniqueName: \"kubernetes.io/projected/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-kube-api-access-c57mc\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.073900 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zd2gg" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfaa7eab-7649-4a7d-8e95-aab12da5f86d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gq6v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:10Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zd2gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.074018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.074037 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.074046 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 
17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.074058 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.074067 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.088128 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62ccdf63-9fbb-4a5c-825b-aefdb1153bfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d900bf1f3701fc269e75b435a2660c096b93a7c043693a5f294666defbe51c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f143223d5223a4e1e7056b09d8a43a04fde926d4c15184b5b452190356db5881\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0111 17:30:32.298484 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0111 17:30:32.299973 1 observer_polling.go:159] Starting file observer\\\\nI0111 17:30:32.306834 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0111 17:30:32.308440 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0111 17:31:02.868341 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": context deadline exceeded\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:30Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d344161514ccacd69f22e10f043ccdcf3b4f5a5cdcc0a0578bcb7b135f2f829\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28776f0910d2b90b9ec6cd95b16bfc77cf6eda9a615666acf6e461076ed9d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.101837 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.115434 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.125795 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbctq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbctq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-77cm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.138309 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.176805 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.176856 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.176867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.176884 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.176896 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.280345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.280391 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.280405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.280426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.280441 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.363584 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:23 crc kubenswrapper[4837]: E0111 17:31:23.363757 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.386302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.386376 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.386398 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.386426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.386458 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.489576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.490103 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.490127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.490153 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.490170 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.531460 4837 generic.go:334] "Generic (PLEG): container finished" podID="46729771-ba0b-4b7c-8245-b2d57acb5a2c" containerID="5f7aabf84755ea73f102e79b2c39d54635582f0a3cf82ccaa8b9f26c6adcb173" exitCode=0 Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.531564 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerDied","Data":"5f7aabf84755ea73f102e79b2c39d54635582f0a3cf82ccaa8b9f26c6adcb173"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.537524 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.543368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" event={"ID":"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a","Type":"ContainerStarted","Data":"438ee43c27c2fb5ef194b88705e75cd3ce8d5581d4580cb0c51a4af1ffe384ec"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.543428 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" event={"ID":"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a","Type":"ContainerStarted","Data":"a0881a131d952480e77f9effad8b5e5403a34a0d53f1f0034872e27da0124a34"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.546760 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"15704025d6b7d15cc708ee005ea866d61f0322aa49289641c2d924132c53c015"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.549856 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"643f4463a26be7fb0ca0845bb463e54852881c1d250af53246d8c4846093037c"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.556907 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v5bf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns552\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:09Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v5bf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.558754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:23 crc kubenswrapper[4837]: E0111 17:31:23.560525 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:23 crc kubenswrapper[4837]: E0111 17:31:23.560629 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs podName:b0d76b5a-6ea4-4508-ac4b-0f74711d7f68 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:24.560605245 +0000 UTC m=+58.738797981 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs") pod "network-metrics-daemon-f2l24" (UID: "b0d76b5a-6ea4-4508-ac4b-0f74711d7f68") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.571259 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbctq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbctq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-77cm7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.590212 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.593320 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.593367 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.593383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.593405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.593420 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.610075 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.629333 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.645579 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2l24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:31:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c57mc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c57mc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:31:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2l24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc 
kubenswrapper[4837]: I0111 17:31:23.665575 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46acf340-e77e-47f8-ba25-95b9b5870af3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-11T17:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-11T17:31:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0111 17:31:09.479691 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0111 17:31:09.479821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0111 17:31:09.480410 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4110613780/tls.crt::/tmp/serving-cert-4110613780/tls.key\\\\\\\"\\\\nI0111 17:31:09.725824 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0111 17:31:10.438563 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0111 17:31:10.438624 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0111 17:31:10.438713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0111 17:31:10.438732 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0111 17:31:10.449082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0111 17:31:10.449135 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449146 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0111 17:31:10.449159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0111 17:31:10.449169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0111 17:31:10.449179 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0111 17:31:10.449187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0111 17:31:10.449256 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0111 17:31:10.451583 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-11T17:31:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-11T17:30:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-11T17:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-11T17:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-11T17:30:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-11T17:31:23Z is after 2025-08-24T17:21:41Z" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.695979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.696020 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.696033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.696056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.696071 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.805921 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.805966 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.805984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.806002 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.806013 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.867626 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=13.867609369 podStartE2EDuration="13.867609369s" podCreationTimestamp="2026-01-11 17:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:23.867502237 +0000 UTC m=+58.045694963" watchObservedRunningTime="2026-01-11 17:31:23.867609369 +0000 UTC m=+58.045802075" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.879550 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=3.879529806 podStartE2EDuration="3.879529806s" podCreationTimestamp="2026-01-11 17:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:23.879479055 +0000 UTC m=+58.057671761" watchObservedRunningTime="2026-01-11 17:31:23.879529806 +0000 UTC m=+58.057722512" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.908960 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.908990 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.909000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.909015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.909027 4837 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:23Z","lastTransitionTime":"2026-01-11T17:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:23 crc kubenswrapper[4837]: I0111 17:31:23.937189 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v5bf5" podStartSLOduration=15.937170653999999 podStartE2EDuration="15.937170654s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:23.936901257 +0000 UTC m=+58.115093983" watchObservedRunningTime="2026-01-11 17:31:23.937170654 +0000 UTC m=+58.115363360" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.000900 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podStartSLOduration=16.000883579 podStartE2EDuration="16.000883579s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:24.000055958 +0000 UTC m=+58.178248674" watchObservedRunningTime="2026-01-11 17:31:24.000883579 +0000 UTC m=+58.179076285" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.011515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.011548 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.011557 4837 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.011570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.011580 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.060647 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kcvkg" podStartSLOduration=16.060632431 podStartE2EDuration="16.060632431s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:24.049876155 +0000 UTC m=+58.228068861" watchObservedRunningTime="2026-01-11 17:31:24.060632431 +0000 UTC m=+58.238825137" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.114447 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.114503 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.114515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.114536 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 
17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.114547 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.217083 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.217436 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.217445 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.217460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.217470 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.320074 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.320133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.320173 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.320195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.320212 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.363784 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.363862 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.363801 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.363969 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.364062 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.364143 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.422843 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.422872 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.422879 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.422891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.422899 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.527871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.527974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.528007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.528041 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.528076 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.557494 4837 generic.go:334] "Generic (PLEG): container finished" podID="46729771-ba0b-4b7c-8245-b2d57acb5a2c" containerID="c26387936a1d7780b285a3c8b352c0715297a3587106bc5530e5f899031bb634" exitCode=0 Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.557574 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerDied","Data":"c26387936a1d7780b285a3c8b352c0715297a3587106bc5530e5f899031bb634"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.567024 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" event={"ID":"be7e19df-ab8a-4616-99b6-3e13a1fa0e4a","Type":"ContainerStarted","Data":"36935ef9464194f12f572deafee59d8c24cac7930c13f20cd9de4e9b988539ab"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.570898 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.571824 4837 scope.go:117] "RemoveContainer" containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.572053 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.574034 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.574289 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.574368 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs podName:b0d76b5a-6ea4-4508-ac4b-0f74711d7f68 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:26.574344359 +0000 UTC m=+60.752537145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs") pod "network-metrics-daemon-f2l24" (UID: "b0d76b5a-6ea4-4508-ac4b-0f74711d7f68") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.578244 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"878ba91885890209a4ad7ed46aaf8610fa1fef6202d8ffc3adf7f527c34d1d18"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.578315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"d6e485df1c1cc7ceb0e56aba342245806d7ec935e617dbb546bbc6717f65fed0"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.578349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" 
event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"7a19d365a2ce792e7e7bc6982d59fd8d21a540943aa7768d919ba498bd627c1e"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.578373 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"2f7c987cd6bac7b28de3c49c8249b4980de62aab3ea9a3b907df9fbeb0309301"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.578394 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"b0d6c2ffdd65ea69d6d830257cf325f8e6b397c8b50a7a55742fed53fe655541"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.578417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"7d938cb09523badf72bb434ce00ef1964af73cabdf0c3652e5dc3bab6d25a703"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.581766 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zd2gg" podStartSLOduration=16.581748188 podStartE2EDuration="16.581748188s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:24.061111763 +0000 UTC m=+58.239304469" watchObservedRunningTime="2026-01-11 17:31:24.581748188 +0000 UTC m=+58.759940894" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.630533 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.630565 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.630575 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.630591 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.630604 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.733067 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.733117 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.733136 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.733159 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.733176 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.834650 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.834697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.834707 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.834722 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.834730 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.877434 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.877812 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-11 17:31:40.877778463 +0000 UTC m=+75.055971209 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.936958 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.937002 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.937012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.937028 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.937038 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:24Z","lastTransitionTime":"2026-01-11T17:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.978709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.978826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.978872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:24 crc kubenswrapper[4837]: I0111 17:31:24.978911 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979083 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979157 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:40.979135162 +0000 UTC m=+75.157327908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979407 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979509 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979558 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979579 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979529 4837 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:40.979501292 +0000 UTC m=+75.157694038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979656 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979711 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979712 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:40.979652085 +0000 UTC m=+75.157844831 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979733 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:24 crc kubenswrapper[4837]: E0111 17:31:24.979785 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:40.979769478 +0000 UTC m=+75.157962224 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.040725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.040782 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.040800 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.040824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.040844 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:25Z","lastTransitionTime":"2026-01-11T17:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.143866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.143895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.143904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.143917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.143926 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:25Z","lastTransitionTime":"2026-01-11T17:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.247378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.247446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.247454 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.247467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.247475 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:25Z","lastTransitionTime":"2026-01-11T17:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.349984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.350094 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.350115 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.350138 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.350156 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:25Z","lastTransitionTime":"2026-01-11T17:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.363408 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:25 crc kubenswrapper[4837]: E0111 17:31:25.363546 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.453593 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.453651 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.453665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.453718 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.453735 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:25Z","lastTransitionTime":"2026-01-11T17:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.536416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.536497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.536523 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.536554 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.536576 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-11T17:31:25Z","lastTransitionTime":"2026-01-11T17:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.582655 4837 generic.go:334] "Generic (PLEG): container finished" podID="46729771-ba0b-4b7c-8245-b2d57acb5a2c" containerID="6b0cfe50dec54e797529cbca3ebd18ccff327617f1f3ccff68d58311d6229c73" exitCode=0 Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.582718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerDied","Data":"6b0cfe50dec54e797529cbca3ebd18ccff327617f1f3ccff68d58311d6229c73"} Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.590011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-77cm7" podStartSLOduration=16.589992462 podStartE2EDuration="16.589992462s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:24.606148615 +0000 UTC m=+58.784341351" watchObservedRunningTime="2026-01-11 17:31:25.589992462 +0000 UTC m=+59.768185168" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.590434 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k"] Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.590857 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.595241 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.595294 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.595381 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.595906 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.686477 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f0ae454-6f30-4b94-97b2-318f4972e1e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.686559 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4f0ae454-6f30-4b94-97b2-318f4972e1e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.686598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4f0ae454-6f30-4b94-97b2-318f4972e1e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.686623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0ae454-6f30-4b94-97b2-318f4972e1e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.686668 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4f0ae454-6f30-4b94-97b2-318f4972e1e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.788255 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f0ae454-6f30-4b94-97b2-318f4972e1e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.788340 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4f0ae454-6f30-4b94-97b2-318f4972e1e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.788432 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4f0ae454-6f30-4b94-97b2-318f4972e1e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.788511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f0ae454-6f30-4b94-97b2-318f4972e1e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.789303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0ae454-6f30-4b94-97b2-318f4972e1e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.789738 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4f0ae454-6f30-4b94-97b2-318f4972e1e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.789779 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4f0ae454-6f30-4b94-97b2-318f4972e1e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.789828 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4f0ae454-6f30-4b94-97b2-318f4972e1e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.794104 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f0ae454-6f30-4b94-97b2-318f4972e1e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.809901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0ae454-6f30-4b94-97b2-318f4972e1e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kkj8k\" (UID: \"4f0ae454-6f30-4b94-97b2-318f4972e1e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: I0111 17:31:25.907956 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" Jan 11 17:31:25 crc kubenswrapper[4837]: W0111 17:31:25.924438 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f0ae454_6f30_4b94_97b2_318f4972e1e8.slice/crio-ef73bb88b4d271c1392f61ae79696cf5bfac0b7f2abfd0dba38d4d3727edd6a9 WatchSource:0}: Error finding container ef73bb88b4d271c1392f61ae79696cf5bfac0b7f2abfd0dba38d4d3727edd6a9: Status 404 returned error can't find the container with id ef73bb88b4d271c1392f61ae79696cf5bfac0b7f2abfd0dba38d4d3727edd6a9 Jan 11 17:31:26 crc kubenswrapper[4837]: I0111 17:31:26.365021 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:26 crc kubenswrapper[4837]: I0111 17:31:26.366921 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:26 crc kubenswrapper[4837]: E0111 17:31:26.366879 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:26 crc kubenswrapper[4837]: I0111 17:31:26.367000 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:26 crc kubenswrapper[4837]: E0111 17:31:26.367547 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:26 crc kubenswrapper[4837]: E0111 17:31:26.367726 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:26 crc kubenswrapper[4837]: I0111 17:31:26.590760 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" event={"ID":"4f0ae454-6f30-4b94-97b2-318f4972e1e8","Type":"ContainerStarted","Data":"ef73bb88b4d271c1392f61ae79696cf5bfac0b7f2abfd0dba38d4d3727edd6a9"} Jan 11 17:31:26 crc kubenswrapper[4837]: I0111 17:31:26.599493 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:26 crc kubenswrapper[4837]: E0111 17:31:26.599714 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:26 crc 
kubenswrapper[4837]: E0111 17:31:26.599795 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs podName:b0d76b5a-6ea4-4508-ac4b-0f74711d7f68 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:30.599770615 +0000 UTC m=+64.777963351 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs") pod "network-metrics-daemon-f2l24" (UID: "b0d76b5a-6ea4-4508-ac4b-0f74711d7f68") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:27 crc kubenswrapper[4837]: I0111 17:31:27.363837 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:27 crc kubenswrapper[4837]: E0111 17:31:27.363981 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:27 crc kubenswrapper[4837]: I0111 17:31:27.599407 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerStarted","Data":"666b77a46f9a8198d0ae587c27a0fd08d6ec22b966f2e850a7b43ce1fdd17156"} Jan 11 17:31:28 crc kubenswrapper[4837]: I0111 17:31:28.363071 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:28 crc kubenswrapper[4837]: I0111 17:31:28.363104 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:28 crc kubenswrapper[4837]: E0111 17:31:28.363782 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:28 crc kubenswrapper[4837]: I0111 17:31:28.363178 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:28 crc kubenswrapper[4837]: E0111 17:31:28.363809 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:28 crc kubenswrapper[4837]: E0111 17:31:28.364638 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:29 crc kubenswrapper[4837]: I0111 17:31:29.363347 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:29 crc kubenswrapper[4837]: E0111 17:31:29.363557 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:29 crc kubenswrapper[4837]: I0111 17:31:29.610635 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"6015de498ef0eae3d4666f7c48ba95f62f9de0c9a9a797e1b23edb1c48e5562d"} Jan 11 17:31:29 crc kubenswrapper[4837]: I0111 17:31:29.612270 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" event={"ID":"4f0ae454-6f30-4b94-97b2-318f4972e1e8","Type":"ContainerStarted","Data":"76ac3b4a5f6ad220fdb0735063409eb9b67b66051062dc459a6772064790f40c"} Jan 11 17:31:29 crc kubenswrapper[4837]: I0111 17:31:29.652790 4837 generic.go:334] "Generic (PLEG): container finished" podID="46729771-ba0b-4b7c-8245-b2d57acb5a2c" containerID="666b77a46f9a8198d0ae587c27a0fd08d6ec22b966f2e850a7b43ce1fdd17156" exitCode=0 Jan 11 17:31:29 crc kubenswrapper[4837]: I0111 17:31:29.652829 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerDied","Data":"666b77a46f9a8198d0ae587c27a0fd08d6ec22b966f2e850a7b43ce1fdd17156"} Jan 11 17:31:29 crc kubenswrapper[4837]: I0111 17:31:29.702314 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkj8k" podStartSLOduration=21.70229519 podStartE2EDuration="21.70229519s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:29.667915038 +0000 UTC m=+63.846107744" watchObservedRunningTime="2026-01-11 17:31:29.70229519 +0000 UTC m=+63.880487916" Jan 11 17:31:30 crc kubenswrapper[4837]: I0111 17:31:30.364540 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:30 crc kubenswrapper[4837]: I0111 17:31:30.364570 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:30 crc kubenswrapper[4837]: I0111 17:31:30.364734 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:30 crc kubenswrapper[4837]: E0111 17:31:30.365346 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:30 crc kubenswrapper[4837]: E0111 17:31:30.365137 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:30 crc kubenswrapper[4837]: E0111 17:31:30.365536 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:30 crc kubenswrapper[4837]: I0111 17:31:30.664953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:30 crc kubenswrapper[4837]: E0111 17:31:30.665123 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:30 crc kubenswrapper[4837]: E0111 17:31:30.665219 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs podName:b0d76b5a-6ea4-4508-ac4b-0f74711d7f68 nodeName:}" failed. No retries permitted until 2026-01-11 17:31:38.66519642 +0000 UTC m=+72.843389206 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs") pod "network-metrics-daemon-f2l24" (UID: "b0d76b5a-6ea4-4508-ac4b-0f74711d7f68") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.363702 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:31 crc kubenswrapper[4837]: E0111 17:31:31.364159 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.666738 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerStarted","Data":"46e136a9cd5d42a81b6152ce026e909ced9579e279e9362ef1c1e162a069d996"} Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.675565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerStarted","Data":"9292b0d8221063fb8d3780811566b26c880419dfe3f21d500094bf550f6249ff"} Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.676076 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.676113 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.676123 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.709802 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 
17:31:31.715821 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:31:31 crc kubenswrapper[4837]: I0111 17:31:31.779327 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podStartSLOduration=23.779303129 podStartE2EDuration="23.779303129s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:31.741006996 +0000 UTC m=+65.919199762" watchObservedRunningTime="2026-01-11 17:31:31.779303129 +0000 UTC m=+65.957495875" Jan 11 17:31:32 crc kubenswrapper[4837]: I0111 17:31:32.363609 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:32 crc kubenswrapper[4837]: I0111 17:31:32.363643 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:32 crc kubenswrapper[4837]: I0111 17:31:32.363620 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:32 crc kubenswrapper[4837]: E0111 17:31:32.363820 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:32 crc kubenswrapper[4837]: E0111 17:31:32.363985 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:32 crc kubenswrapper[4837]: E0111 17:31:32.364051 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:33 crc kubenswrapper[4837]: I0111 17:31:33.362919 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:33 crc kubenswrapper[4837]: E0111 17:31:33.363019 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:34 crc kubenswrapper[4837]: I0111 17:31:34.374917 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:34 crc kubenswrapper[4837]: I0111 17:31:34.374951 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:34 crc kubenswrapper[4837]: I0111 17:31:34.375097 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:34 crc kubenswrapper[4837]: E0111 17:31:34.375088 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:34 crc kubenswrapper[4837]: E0111 17:31:34.375177 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:34 crc kubenswrapper[4837]: E0111 17:31:34.375280 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:35 crc kubenswrapper[4837]: I0111 17:31:35.363665 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:35 crc kubenswrapper[4837]: E0111 17:31:35.363810 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:35 crc kubenswrapper[4837]: I0111 17:31:35.373720 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 11 17:31:36 crc kubenswrapper[4837]: I0111 17:31:36.363593 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:36 crc kubenswrapper[4837]: I0111 17:31:36.363747 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:36 crc kubenswrapper[4837]: E0111 17:31:36.363789 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:36 crc kubenswrapper[4837]: I0111 17:31:36.363856 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:36 crc kubenswrapper[4837]: E0111 17:31:36.363997 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:36 crc kubenswrapper[4837]: E0111 17:31:36.364145 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:36 crc kubenswrapper[4837]: I0111 17:31:36.377758 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.377737263 podStartE2EDuration="1.377737263s" podCreationTimestamp="2026-01-11 17:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:36.376708617 +0000 UTC m=+70.554901323" watchObservedRunningTime="2026-01-11 17:31:36.377737263 +0000 UTC m=+70.555929999" Jan 11 17:31:36 crc kubenswrapper[4837]: I0111 17:31:36.445806 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f2l24"] Jan 11 17:31:36 crc kubenswrapper[4837]: I0111 17:31:36.691158 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:36 crc kubenswrapper[4837]: E0111 17:31:36.691684 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:37 crc kubenswrapper[4837]: I0111 17:31:37.363661 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:37 crc kubenswrapper[4837]: E0111 17:31:37.363795 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:37 crc kubenswrapper[4837]: I0111 17:31:37.364487 4837 scope.go:117] "RemoveContainer" containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" Jan 11 17:31:37 crc kubenswrapper[4837]: I0111 17:31:37.698201 4837 generic.go:334] "Generic (PLEG): container finished" podID="46729771-ba0b-4b7c-8245-b2d57acb5a2c" containerID="46e136a9cd5d42a81b6152ce026e909ced9579e279e9362ef1c1e162a069d996" exitCode=0 Jan 11 17:31:37 crc kubenswrapper[4837]: I0111 17:31:37.698250 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerDied","Data":"46e136a9cd5d42a81b6152ce026e909ced9579e279e9362ef1c1e162a069d996"} Jan 11 17:31:38 crc kubenswrapper[4837]: I0111 17:31:38.363404 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:38 crc kubenswrapper[4837]: I0111 17:31:38.363518 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:38 crc kubenswrapper[4837]: I0111 17:31:38.363419 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:38 crc kubenswrapper[4837]: E0111 17:31:38.363658 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:38 crc kubenswrapper[4837]: E0111 17:31:38.363827 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:38 crc kubenswrapper[4837]: E0111 17:31:38.363917 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:38 crc kubenswrapper[4837]: I0111 17:31:38.763185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:38 crc kubenswrapper[4837]: E0111 17:31:38.763337 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:38 crc kubenswrapper[4837]: E0111 17:31:38.763427 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs podName:b0d76b5a-6ea4-4508-ac4b-0f74711d7f68 nodeName:}" failed. 
No retries permitted until 2026-01-11 17:31:54.763404359 +0000 UTC m=+88.941597105 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs") pod "network-metrics-daemon-f2l24" (UID: "b0d76b5a-6ea4-4508-ac4b-0f74711d7f68") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 11 17:31:39 crc kubenswrapper[4837]: I0111 17:31:39.363740 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:39 crc kubenswrapper[4837]: E0111 17:31:39.363941 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:39 crc kubenswrapper[4837]: I0111 17:31:39.484549 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovnkube-controller" probeResult="failure" output="" Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.364110 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.364288 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.364372 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.364110 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.364510 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.364584 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.893819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.893982 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:32:12.893961403 +0000 UTC m=+107.072154119 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.995409 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.995537 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.995597 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:40 crc kubenswrapper[4837]: I0111 17:31:40.995651 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995748 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995770 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995798 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995828 4837 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995832 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995848 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995869 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:32:12.99584683 +0000 UTC m=+107.174039616 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995952 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-11 17:32:12.995929982 +0000 UTC m=+107.174122728 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995868 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995973 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-11 17:32:12.995963573 +0000 UTC m=+107.174156389 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.995976 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:40 crc kubenswrapper[4837]: E0111 17:31:40.996032 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-11 17:32:12.996016244 +0000 UTC m=+107.174208950 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 11 17:31:41 crc kubenswrapper[4837]: I0111 17:31:41.363608 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:41 crc kubenswrapper[4837]: E0111 17:31:41.363837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:42 crc kubenswrapper[4837]: I0111 17:31:42.363051 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:42 crc kubenswrapper[4837]: I0111 17:31:42.363174 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:42 crc kubenswrapper[4837]: I0111 17:31:42.363051 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:42 crc kubenswrapper[4837]: E0111 17:31:42.363199 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:42 crc kubenswrapper[4837]: E0111 17:31:42.363348 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:42 crc kubenswrapper[4837]: E0111 17:31:42.363449 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:43 crc kubenswrapper[4837]: I0111 17:31:43.363364 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:43 crc kubenswrapper[4837]: E0111 17:31:43.363541 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:44 crc kubenswrapper[4837]: I0111 17:31:44.363251 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:44 crc kubenswrapper[4837]: E0111 17:31:44.363377 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 11 17:31:44 crc kubenswrapper[4837]: I0111 17:31:44.363580 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:44 crc kubenswrapper[4837]: E0111 17:31:44.363635 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 11 17:31:44 crc kubenswrapper[4837]: I0111 17:31:44.363798 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:44 crc kubenswrapper[4837]: E0111 17:31:44.363859 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2l24" podUID="b0d76b5a-6ea4-4508-ac4b-0f74711d7f68" Jan 11 17:31:44 crc kubenswrapper[4837]: I0111 17:31:44.724491 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerStarted","Data":"e5786d8b324fd35829010199a5bd85870c9b2c8a5ec31e51e340d5712aedba08"} Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.362988 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:45 crc kubenswrapper[4837]: E0111 17:31:45.363117 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.732474 4837 generic.go:334] "Generic (PLEG): container finished" podID="46729771-ba0b-4b7c-8245-b2d57acb5a2c" containerID="e5786d8b324fd35829010199a5bd85870c9b2c8a5ec31e51e340d5712aedba08" exitCode=0 Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.732550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerDied","Data":"e5786d8b324fd35829010199a5bd85870c9b2c8a5ec31e51e340d5712aedba08"} Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.738407 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.741837 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"998f980d03503669fbff3f3dd9cee724c43b1a41050d8cc67ba424e9823a4220"} Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.742588 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.882700 4837 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeReady" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.882877 4837 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.942614 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=29.942591612 podStartE2EDuration="29.942591612s" podCreationTimestamp="2026-01-11 17:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:45.809307446 +0000 UTC m=+79.987500162" watchObservedRunningTime="2026-01-11 17:31:45.942591612 +0000 UTC m=+80.120784318" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.945111 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pz586"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.945591 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-v8df8"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.946131 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.947853 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l2c7d"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.947947 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.948814 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n8rwr"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.949206 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.949502 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.953331 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.953611 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.953855 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.954052 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.954286 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.954507 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.954789 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-599f5"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.955437 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.955718 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.956029 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.956481 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.957065 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.957747 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ljlz2"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.958254 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.962330 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.962952 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963035 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.962985 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963586 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963304 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963864 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963768 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963808 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963868 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964137 4837 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963977 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964040 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964281 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964051 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.963939 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964442 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964400 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964531 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dmnqc"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964666 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.964902 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 
17:31:45.965000 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965038 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965223 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965296 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965388 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dmnqc" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965423 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965561 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965653 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.965853 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.966161 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.966306 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.966534 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.967261 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-n8ldn"] Jan 11 17:31:45 crc kubenswrapper[4837]: I0111 17:31:45.967788 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.002364 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.002479 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.002782 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.002978 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.002985 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.003112 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.003924 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.005270 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.005331 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.005692 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.007294 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.008636 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.026512 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.031782 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.032035 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.032155 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.032310 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.032419 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.033144 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.033228 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.033772 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.033943 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.034043 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.034134 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.034841 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.035885 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.035936 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.036106 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.036186 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.036412 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.036518 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.036616 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.038569 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.038910 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.039192 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.039462 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.039604 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.039887 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.039921 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.040062 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.041217 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.041273 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.041337 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ldzgv"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.043364 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.044264 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.055789 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060442 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvng8\" (UniqueName: \"kubernetes.io/projected/59f65053-461b-4ad3-acc6-2a29b1fd06c6-kube-api-access-pvng8\") pod 
\"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060473 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-serving-cert\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f0dabf-1deb-453a-9ac9-11324df2b806-serving-cert\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060517 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f7469a-7ddb-4d35-962e-86154f7750c9-serving-cert\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060533 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59f65053-461b-4ad3-acc6-2a29b1fd06c6-auth-proxy-config\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060553 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-etcd-client\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-serving-cert\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060584 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09971765-a82b-4ab2-b79a-5defb8feb416-node-pullsecrets\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.060615 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8d4m\" (UniqueName: \"kubernetes.io/projected/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-kube-api-access-s8d4m\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: 
\"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.063047 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.063559 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.064933 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.065999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a45207f1-08c5-4ae8-9ac3-3e3099df8218-metrics-tls\") pod \"dns-operator-744455d44c-n8rwr\" (UID: \"a45207f1-08c5-4ae8-9ac3-3e3099df8218\") " pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066101 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58f0dabf-1deb-453a-9ac9-11324df2b806-trusted-ca\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066176 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 
17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066250 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59f65053-461b-4ad3-acc6-2a29b1fd06c6-config\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4vz2\" (UniqueName: \"kubernetes.io/projected/f9f7469a-7ddb-4d35-962e-86154f7750c9-kube-api-access-g4vz2\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-service-ca\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066547 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhmkp\" (UniqueName: \"kubernetes.io/projected/8854497b-9e74-4bd1-b465-847ad61d8779-kube-api-access-mhmkp\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066892 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066919 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-config\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066944 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.066965 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-encryption-config\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067023 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-467k8\" (UniqueName: \"kubernetes.io/projected/75ee2960-aa16-4aea-84f2-d60c34d6fb1a-kube-api-access-467k8\") pod \"downloads-7954f5f757-dmnqc\" (UID: \"75ee2960-aa16-4aea-84f2-d60c34d6fb1a\") " pod="openshift-console/downloads-7954f5f757-dmnqc" Jan 11 17:31:46 crc 
kubenswrapper[4837]: I0111 17:31:46.067042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-service-ca-bundle\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067057 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-config\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067076 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-console-config\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8sxd\" (UniqueName: \"kubernetes.io/projected/1141f492-afec-40f3-bde7-7072d6a75a68-kube-api-access-l8sxd\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067111 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cs68\" (UniqueName: \"kubernetes.io/projected/a45207f1-08c5-4ae8-9ac3-3e3099df8218-kube-api-access-2cs68\") pod 
\"dns-operator-744455d44c-n8rwr\" (UID: \"a45207f1-08c5-4ae8-9ac3-3e3099df8218\") " pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067132 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-image-import-ca\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067164 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-config\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067180 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f0dabf-1deb-453a-9ac9-11324df2b806-config\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067199 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067217 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067242 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-serving-cert\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067260 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-client-ca\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067277 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpvc9\" (UniqueName: \"kubernetes.io/projected/58f0dabf-1deb-453a-9ac9-11324df2b806-kube-api-access-zpvc9\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " 
pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn278\" (UniqueName: \"kubernetes.io/projected/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-kube-api-access-cn278\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067320 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-encryption-config\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067339 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-serving-cert\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067362 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-audit-policies\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8854497b-9e74-4bd1-b465-847ad61d8779-available-featuregates\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8854497b-9e74-4bd1-b465-847ad61d8779-serving-cert\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-trusted-ca-bundle\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067455 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-etcd-serving-ca\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067482 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/59f65053-461b-4ad3-acc6-2a29b1fd06c6-machine-approver-tls\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-oauth-config\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067524 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsxf4\" (UniqueName: \"kubernetes.io/projected/09971765-a82b-4ab2-b79a-5defb8feb416-kube-api-access-lsxf4\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067562 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c9940b-89a9-414c-ab2a-c4c1b4519725-serving-cert\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067590 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zrczq\" (UniqueName: \"kubernetes.io/projected/76c9940b-89a9-414c-ab2a-c4c1b4519725-kube-api-access-zrczq\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067611 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-audit-dir\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067632 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067651 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-audit\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067668 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz7j5\" (UniqueName: \"kubernetes.io/projected/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-kube-api-access-fz7j5\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 
17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09971765-a82b-4ab2-b79a-5defb8feb416-audit-dir\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067724 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66tn\" (UniqueName: \"kubernetes.io/projected/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-kube-api-access-w66tn\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-client-ca\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067770 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-config\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067801 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-etcd-client\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.067817 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-oauth-serving-cert\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.071236 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xk9j9"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.073575 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.073942 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.074364 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pjl6t"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.074608 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.074627 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-599f5"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.074711 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.074861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dmnqc"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.075659 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.076061 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.076393 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.076754 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.080107 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.083485 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.083643 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.083781 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.104399 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-pz586"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.104452 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-n8ldn"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.106457 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.113273 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.113453 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.113614 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.113840 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.113873 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v8df8"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.113973 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.115045 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.115119 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 
17:31:46.115171 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.115198 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.125334 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.126812 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.127197 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.127340 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.127210 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.127636 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.127548 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.127926 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.128009 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 11 17:31:46 crc 
kubenswrapper[4837]: I0111 17:31:46.128141 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.128166 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.128519 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.129033 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xk9j9"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.131111 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.129014 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.129105 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.129610 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.131650 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.130085 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.130214 4837 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.131724 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.136140 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.136268 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.136762 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.136817 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.137308 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4nvrr"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.137479 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.137714 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.137769 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.138769 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-cscqg"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.139334 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gkkhd"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.139960 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.140064 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.140098 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.141160 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.141557 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.141704 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.142234 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.142813 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.142873 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.144748 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nqs9r"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.145066 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.145136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.145168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.145791 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.146108 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.146176 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.146763 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.147151 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.148051 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.149117 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2984c"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.149588 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.151328 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wbdfb"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.151742 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.152153 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.152167 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.152356 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wbdfb" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.153008 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.155010 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.156818 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.160129 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.166084 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.166110 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pvfwl"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.167561 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.167810 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168780 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-serving-cert\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8854497b-9e74-4bd1-b465-847ad61d8779-available-featuregates\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168848 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-audit-policies\") pod 
\"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8854497b-9e74-4bd1-b465-847ad61d8779-serving-cert\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168890 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-trusted-ca-bundle\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168915 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-etcd-serving-ca\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/59f65053-461b-4ad3-acc6-2a29b1fd06c6-machine-approver-tls\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168950 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168970 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c9940b-89a9-414c-ab2a-c4c1b4519725-serving-cert\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.168989 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrczq\" (UniqueName: \"kubernetes.io/projected/76c9940b-89a9-414c-ab2a-c4c1b4519725-kube-api-access-zrczq\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-oauth-config\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169028 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsxf4\" (UniqueName: \"kubernetes.io/projected/09971765-a82b-4ab2-b79a-5defb8feb416-kube-api-access-lsxf4\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 
17:31:46.169124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndmmj\" (UniqueName: \"kubernetes.io/projected/8ec2b154-4844-440e-bec5-911d8456ac91-kube-api-access-ndmmj\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169148 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169168 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-audit-dir\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169187 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-audit\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169204 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz7j5\" (UniqueName: \"kubernetes.io/projected/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-kube-api-access-fz7j5\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09971765-a82b-4ab2-b79a-5defb8feb416-audit-dir\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169243 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w66tn\" (UniqueName: \"kubernetes.io/projected/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-kube-api-access-w66tn\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169285 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-client-ca\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-etcd-client\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169323 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-config\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: 
\"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169343 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-oauth-serving-cert\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169362 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-serving-cert\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169381 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6184fa1c-4a00-4ae3-9dad-a5672220571e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169402 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvng8\" (UniqueName: \"kubernetes.io/projected/59f65053-461b-4ad3-acc6-2a29b1fd06c6-kube-api-access-pvng8\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169420 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f0dabf-1deb-453a-9ac9-11324df2b806-serving-cert\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f7469a-7ddb-4d35-962e-86154f7750c9-serving-cert\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169576 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59f65053-461b-4ad3-acc6-2a29b1fd06c6-auth-proxy-config\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169609 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-etcd-client\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169628 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a05a50b-ccec-45f2-be70-d210e5334d18-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc 
kubenswrapper[4837]: I0111 17:31:46.169883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-audit-dir\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.169946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-serving-cert\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170016 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09971765-a82b-4ab2-b79a-5defb8feb416-node-pullsecrets\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170042 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170070 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a05a50b-ccec-45f2-be70-d210e5334d18-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170095 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a45207f1-08c5-4ae8-9ac3-3e3099df8218-metrics-tls\") pod \"dns-operator-744455d44c-n8rwr\" (UID: \"a45207f1-08c5-4ae8-9ac3-3e3099df8218\") " pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170149 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58f0dabf-1deb-453a-9ac9-11324df2b806-trusted-ca\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170202 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8d4m\" (UniqueName: \"kubernetes.io/projected/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-kube-api-access-s8d4m\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170228 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-service-ca\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170270 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170287 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59f65053-461b-4ad3-acc6-2a29b1fd06c6-config\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4vz2\" (UniqueName: \"kubernetes.io/projected/f9f7469a-7ddb-4d35-962e-86154f7750c9-kube-api-access-g4vz2\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec2b154-4844-440e-bec5-911d8456ac91-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhmkp\" (UniqueName: \"kubernetes.io/projected/8854497b-9e74-4bd1-b465-847ad61d8779-kube-api-access-mhmkp\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170557 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt6wj\" (UniqueName: \"kubernetes.io/projected/6184fa1c-4a00-4ae3-9dad-a5672220571e-kube-api-access-kt6wj\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-config\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170609 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170640 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-encryption-config\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170848 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec2b154-4844-440e-bec5-911d8456ac91-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170916 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6184fa1c-4a00-4ae3-9dad-a5672220571e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.170949 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ec2b154-4844-440e-bec5-911d8456ac91-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171011 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-467k8\" (UniqueName: \"kubernetes.io/projected/75ee2960-aa16-4aea-84f2-d60c34d6fb1a-kube-api-access-467k8\") pod \"downloads-7954f5f757-dmnqc\" (UID: \"75ee2960-aa16-4aea-84f2-d60c34d6fb1a\") " pod="openshift-console/downloads-7954f5f757-dmnqc" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 
17:31:46.171053 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-service-ca-bundle\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09971765-a82b-4ab2-b79a-5defb8feb416-audit-dir\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171086 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-config\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171145 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cs68\" (UniqueName: \"kubernetes.io/projected/a45207f1-08c5-4ae8-9ac3-3e3099df8218-kube-api-access-2cs68\") pod \"dns-operator-744455d44c-n8rwr\" (UID: \"a45207f1-08c5-4ae8-9ac3-3e3099df8218\") " pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171157 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8854497b-9e74-4bd1-b465-847ad61d8779-available-featuregates\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171179 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-console-config\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8sxd\" (UniqueName: \"kubernetes.io/projected/1141f492-afec-40f3-bde7-7072d6a75a68-kube-api-access-l8sxd\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f0dabf-1deb-453a-9ac9-11324df2b806-config\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171284 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-image-import-ca\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: 
\"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171351 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-config\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171398 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a05a50b-ccec-45f2-be70-d210e5334d18-config\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171417 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-serving-cert\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171456 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-client-ca\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpvc9\" (UniqueName: \"kubernetes.io/projected/58f0dabf-1deb-453a-9ac9-11324df2b806-kube-api-access-zpvc9\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171573 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn278\" (UniqueName: \"kubernetes.io/projected/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-kube-api-access-cn278\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.171652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-encryption-config\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.172103 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-audit-policies\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.173186 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-audit\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.173349 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-config\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.175693 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.175942 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.176776 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-7mr75"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.177100 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58f0dabf-1deb-453a-9ac9-11324df2b806-trusted-ca\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.177442 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-trusted-ca-bundle\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.177827 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.182968 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-oauth-serving-cert\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.179535 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-client-ca\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.181653 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-service-ca-bundle\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.183115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/59f65053-461b-4ad3-acc6-2a29b1fd06c6-machine-approver-tls\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.178995 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-etcd-serving-ca\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.183399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-config\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.183444 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n8rwr"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.184222 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59f65053-461b-4ad3-acc6-2a29b1fd06c6-config\") pod \"machine-approver-56656f9798-hchsh\" (UID: 
\"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.184239 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-config\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.184993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.185846 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-etcd-client\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.186027 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-serving-cert\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.186109 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/09971765-a82b-4ab2-b79a-5defb8feb416-node-pullsecrets\") pod \"apiserver-76f77b778f-pz586\" (UID: 
\"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.186239 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59f65053-461b-4ad3-acc6-2a29b1fd06c6-auth-proxy-config\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.186242 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-console-config\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.190019 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.190013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.190674 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f0dabf-1deb-453a-9ac9-11324df2b806-config\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.190890 4837 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nqs9r"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.190927 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.190939 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.191051 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-serving-cert\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.191246 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.191933 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-client-ca\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.192199 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a45207f1-08c5-4ae8-9ac3-3e3099df8218-metrics-tls\") pod \"dns-operator-744455d44c-n8rwr\" (UID: \"a45207f1-08c5-4ae8-9ac3-3e3099df8218\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.193013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f7469a-7ddb-4d35-962e-86154f7750c9-serving-cert\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.194108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-encryption-config\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.194150 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.196584 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.196602 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l2c7d"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.197529 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.197617 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-serving-cert\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.197717 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-service-ca\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.198035 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-encryption-config\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.198115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c9940b-89a9-414c-ab2a-c4c1b4519725-serving-cert\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.198547 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-config\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.198590 4837 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.200013 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ljlz2"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.200542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.201085 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8854497b-9e74-4bd1-b465-847ad61d8779-serving-cert\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.201088 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09971765-a82b-4ab2-b79a-5defb8feb416-serving-cert\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.202850 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-oauth-config\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.206127 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.207860 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-image-import-ca\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.207999 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09971765-a82b-4ab2-b79a-5defb8feb416-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.208086 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-etcd-client\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.208719 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f0dabf-1deb-453a-9ac9-11324df2b806-serving-cert\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 
17:31:46.222713 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.226320 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wbdfb"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.226734 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gkkhd"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.228523 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.229319 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.230121 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pjl6t"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.230301 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ldzgv"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.231886 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.232037 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.233243 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.234247 4837 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.235264 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.236366 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.237794 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2984c"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.238478 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.239562 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.240820 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4nvrr"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.241837 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dddb7"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.242737 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.243642 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tdxwr"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.245368 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.245450 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.245543 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dddb7"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.246514 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tdxwr"] Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.249333 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.268423 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6184fa1c-4a00-4ae3-9dad-a5672220571e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277145 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6a05a50b-ccec-45f2-be70-d210e5334d18-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277183 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a05a50b-ccec-45f2-be70-d210e5334d18-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277219 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt6wj\" (UniqueName: \"kubernetes.io/projected/6184fa1c-4a00-4ae3-9dad-a5672220571e-kube-api-access-kt6wj\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277258 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec2b154-4844-440e-bec5-911d8456ac91-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277279 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6184fa1c-4a00-4ae3-9dad-a5672220571e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277294 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ec2b154-4844-440e-bec5-911d8456ac91-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277308 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec2b154-4844-440e-bec5-911d8456ac91-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a05a50b-ccec-45f2-be70-d210e5334d18-config\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.277450 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndmmj\" (UniqueName: \"kubernetes.io/projected/8ec2b154-4844-440e-bec5-911d8456ac91-kube-api-access-ndmmj\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.278644 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ec2b154-4844-440e-bec5-911d8456ac91-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.281017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ec2b154-4844-440e-bec5-911d8456ac91-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.289794 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.308411 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.348817 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.363196 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.363414 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.363592 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.368319 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.391945 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.408930 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.429493 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.448885 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.468447 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.488762 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.502928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6184fa1c-4a00-4ae3-9dad-a5672220571e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.509261 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.518980 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6184fa1c-4a00-4ae3-9dad-a5672220571e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.529430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.549302 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.568960 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.582462 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a05a50b-ccec-45f2-be70-d210e5334d18-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.589443 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" 
Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.599499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a05a50b-ccec-45f2-be70-d210e5334d18-config\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.608579 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.629438 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.635668 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.651309 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.668944 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.689533 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.728532 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 11 17:31:46 crc kubenswrapper[4837]: 
I0111 17:31:46.749299 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.795874 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.798106 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.807838 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.829432 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.849453 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.868251 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.888433 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.919436 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.929044 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.948582 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.969166 4837 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 11 17:31:46 crc kubenswrapper[4837]: I0111 17:31:46.989013 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.009509 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.029552 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.048994 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.070621 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.095772 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.109722 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.129563 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.147039 4837 request.go:700] Waited for 1.002849281s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.149058 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.169634 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.189257 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.219577 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.228931 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.248438 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.268937 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.289566 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.309131 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.328608 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.348469 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.363577 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.368340 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.389235 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.408933 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.428383 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.449215 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.469135 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.488610 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.509315 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.530275 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.549142 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.569267 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.588979 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.608546 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.629006 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.653525 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.673000 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.692729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.709438 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 11 17:31:47 
crc kubenswrapper[4837]: I0111 17:31:47.730138 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.748823 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.755629 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7996w" event={"ID":"46729771-ba0b-4b7c-8245-b2d57acb5a2c","Type":"ContainerStarted","Data":"e0a028b238c9226d8bb8bda91f15ae6288296c83cc8eb751725411c315cea81c"} Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.768552 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.789318 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.810303 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.829528 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.883710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz7j5\" (UniqueName: \"kubernetes.io/projected/ecf9daad-b73a-4f1a-9247-ee4973ad1bc7-kube-api-access-fz7j5\") pod \"apiserver-7bbb656c7d-2ngzf\" (UID: \"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.899191 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-467k8\" (UniqueName: \"kubernetes.io/projected/75ee2960-aa16-4aea-84f2-d60c34d6fb1a-kube-api-access-467k8\") pod \"downloads-7954f5f757-dmnqc\" (UID: \"75ee2960-aa16-4aea-84f2-d60c34d6fb1a\") " pod="openshift-console/downloads-7954f5f757-dmnqc" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.919745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66tn\" (UniqueName: \"kubernetes.io/projected/26867a3e-74b2-4ef0-b671-52b6f72fb0d3-kube-api-access-w66tn\") pod \"openshift-apiserver-operator-796bbdcf4f-fpvgj\" (UID: \"26867a3e-74b2-4ef0-b671-52b6f72fb0d3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.929954 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.935955 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8d4m\" (UniqueName: \"kubernetes.io/projected/1d6837bf-33ce-474a-a4f5-22d1fa7b95b1-kube-api-access-s8d4m\") pod \"authentication-operator-69f744f599-l2c7d\" (UID: \"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.938437 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.950142 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.989461 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 11 17:31:47 crc kubenswrapper[4837]: I0111 17:31:47.997582 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8sxd\" (UniqueName: \"kubernetes.io/projected/1141f492-afec-40f3-bde7-7072d6a75a68-kube-api-access-l8sxd\") pod \"console-f9d7485db-v8df8\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.030742 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dmnqc" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.037335 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhmkp\" (UniqueName: \"kubernetes.io/projected/8854497b-9e74-4bd1-b465-847ad61d8779-kube-api-access-mhmkp\") pod \"openshift-config-operator-7777fb866f-599f5\" (UID: \"8854497b-9e74-4bd1-b465-847ad61d8779\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.044611 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.057244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cs68\" (UniqueName: \"kubernetes.io/projected/a45207f1-08c5-4ae8-9ac3-3e3099df8218-kube-api-access-2cs68\") pod \"dns-operator-744455d44c-n8rwr\" (UID: \"a45207f1-08c5-4ae8-9ac3-3e3099df8218\") " pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.101483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvng8\" (UniqueName: \"kubernetes.io/projected/59f65053-461b-4ad3-acc6-2a29b1fd06c6-kube-api-access-pvng8\") pod \"machine-approver-56656f9798-hchsh\" (UID: \"59f65053-461b-4ad3-acc6-2a29b1fd06c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.109705 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4vz2\" (UniqueName: \"kubernetes.io/projected/f9f7469a-7ddb-4d35-962e-86154f7750c9-kube-api-access-g4vz2\") pod \"controller-manager-879f6c89f-ljlz2\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.121077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrczq\" (UniqueName: \"kubernetes.io/projected/76c9940b-89a9-414c-ab2a-c4c1b4519725-kube-api-access-zrczq\") pod \"route-controller-manager-6576b87f9c-cwdvp\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.140345 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsxf4\" (UniqueName: 
\"kubernetes.io/projected/09971765-a82b-4ab2-b79a-5defb8feb416-kube-api-access-lsxf4\") pod \"apiserver-76f77b778f-pz586\" (UID: \"09971765-a82b-4ab2-b79a-5defb8feb416\") " pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.141837 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.147532 4837 request.go:700] Waited for 1.955342198s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.148577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn278\" (UniqueName: \"kubernetes.io/projected/8bd32a92-1f67-46c3-831f-6b50d22d0fb2-kube-api-access-cn278\") pod \"openshift-controller-manager-operator-756b6f6bc6-wlfcq\" (UID: \"8bd32a92-1f67-46c3-831f-6b50d22d0fb2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.164210 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.169562 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.176906 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpvc9\" (UniqueName: \"kubernetes.io/projected/58f0dabf-1deb-453a-9ac9-11324df2b806-kube-api-access-zpvc9\") pod \"console-operator-58897d9998-n8ldn\" (UID: \"58f0dabf-1deb-453a-9ac9-11324df2b806\") " pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.191987 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.208381 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.208582 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.227732 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.228382 4837 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.230332 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:48 crc kubenswrapper[4837]: W0111 17:31:48.243612 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf9daad_b73a_4f1a_9247_ee4973ad1bc7.slice/crio-ce3680fbd087d6c8f8a518166f3ab8daef27ba23434c211e4afcd22027d4e0e3 WatchSource:0}: Error finding container ce3680fbd087d6c8f8a518166f3ab8daef27ba23434c211e4afcd22027d4e0e3: Status 404 returned error can't find the container with id ce3680fbd087d6c8f8a518166f3ab8daef27ba23434c211e4afcd22027d4e0e3 Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.248627 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.249845 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.256949 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.270692 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.310551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt6wj\" (UniqueName: \"kubernetes.io/projected/6184fa1c-4a00-4ae3-9dad-a5672220571e-kube-api-access-kt6wj\") pod \"kube-storage-version-migrator-operator-b67b599dd-r58pw\" (UID: \"6184fa1c-4a00-4ae3-9dad-a5672220571e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.319266 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.335319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndmmj\" (UniqueName: \"kubernetes.io/projected/8ec2b154-4844-440e-bec5-911d8456ac91-kube-api-access-ndmmj\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.336854 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.345961 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.348557 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dmnqc"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.352106 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.357169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a05a50b-ccec-45f2-be70-d210e5334d18-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c9fvj\" (UID: \"6a05a50b-ccec-45f2-be70-d210e5334d18\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.363453 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ec2b154-4844-440e-bec5-911d8456ac91-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mklqs\" (UID: \"8ec2b154-4844-440e-bec5-911d8456ac91\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.389032 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 11 17:31:48 crc kubenswrapper[4837]: W0111 17:31:48.390885 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75ee2960_aa16_4aea_84f2_d60c34d6fb1a.slice/crio-f6f5684c866a5676288642fe256335f4f99d0cd4d55b54211d3dee3b0f063737 WatchSource:0}: Error finding container f6f5684c866a5676288642fe256335f4f99d0cd4d55b54211d3dee3b0f063737: Status 404 returned error 
can't find the container with id f6f5684c866a5676288642fe256335f4f99d0cd4d55b54211d3dee3b0f063737 Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.392562 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.403793 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.409616 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.410697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.428832 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.439716 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.450875 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.472696 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.489705 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519126 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-tls\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519538 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519582 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-ca\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 
17:31:48.519621 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519646 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-policies\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519762 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-trusted-ca\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519788 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519833 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqmvk\" (UniqueName: \"kubernetes.io/projected/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-kube-api-access-sqmvk\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519911 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8mhv\" (UniqueName: \"kubernetes.io/projected/1b5ce6bf-72e2-494a-aa22-830e992fbec5-kube-api-access-x8mhv\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519947 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb63c9f5-457d-4c61-8cc6-56690e66a952-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519971 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcls\" (UniqueName: \"kubernetes.io/projected/876b267e-baf3-4f7d-a4a3-49f44b4dfbb7-kube-api-access-xhcls\") pod \"cluster-samples-operator-665b6dd947-kq5ch\" (UID: \"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.519995 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/43082497-a570-48fc-95f8-eda27581cde7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2vscb\" (UID: \"43082497-a570-48fc-95f8-eda27581cde7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-service-ca\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520075 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85wsv\" (UniqueName: \"kubernetes.io/projected/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-kube-api-access-85wsv\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520113 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520158 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r45g8\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-kube-api-access-r45g8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520223 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-dir\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520290 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-certificates\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-client\") pod 
\"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520345 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-srv-cert\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520414 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-config\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520456 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520480 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-images\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520503 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g27x\" (UniqueName: \"kubernetes.io/projected/43082497-a570-48fc-95f8-eda27581cde7-kube-api-access-8g27x\") pod \"package-server-manager-789f6589d5-2vscb\" (UID: \"43082497-a570-48fc-95f8-eda27581cde7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520537 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-bound-sa-token\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520569 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/876b267e-baf3-4f7d-a4a3-49f44b4dfbb7-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-kq5ch\" (UID: \"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520592 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-config\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520647 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpr2w\" (UniqueName: \"kubernetes.io/projected/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-kube-api-access-mpr2w\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520719 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-serving-cert\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: 
I0111 17:31:48.520740 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520879 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520906 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb63c9f5-457d-4c61-8cc6-56690e66a952-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.520927 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: E0111 17:31:48.527181 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.027166068 +0000 UTC m=+83.205358774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.559321 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ljlz2"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.580048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-599f5"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.585299 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.605868 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v8df8"] Jan 11 17:31:48 crc kubenswrapper[4837]: W0111 17:31:48.606309 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f7469a_7ddb_4d35_962e_86154f7750c9.slice/crio-a69046d634c7d619658399991d46773b3a2bec6a5441e85c91462348e8410ae9 WatchSource:0}: Error finding container a69046d634c7d619658399991d46773b3a2bec6a5441e85c91462348e8410ae9: Status 404 returned error can't find the container with id a69046d634c7d619658399991d46773b3a2bec6a5441e85c91462348e8410ae9 Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.621995 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622158 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8mhv\" (UniqueName: \"kubernetes.io/projected/1b5ce6bf-72e2-494a-aa22-830e992fbec5-kube-api-access-x8mhv\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: E0111 17:31:48.622184 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.12215421 +0000 UTC m=+83.300346916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eff370b-3729-4929-b73f-752f1b00a318-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622392 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c7124f1-e7cc-4ae8-89d2-19457c045576-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2984c\" (UID: \"8c7124f1-e7cc-4ae8-89d2-19457c045576\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622478 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcls\" (UniqueName: \"kubernetes.io/projected/876b267e-baf3-4f7d-a4a3-49f44b4dfbb7-kube-api-access-xhcls\") pod \"cluster-samples-operator-665b6dd947-kq5ch\" (UID: \"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-plugins-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622545 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85wsv\" (UniqueName: \"kubernetes.io/projected/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-kube-api-access-85wsv\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4047a74f-e1b9-40a2-b525-bc06011477d7-srv-cert\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb48d08d-1606-49d9-a55d-53b21ad9f404-webhook-cert\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622610 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e59909fb-c783-46de-9955-1c31cc9fd6b2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 
17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622635 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rjn\" (UniqueName: \"kubernetes.io/projected/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-kube-api-access-l7rjn\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.622751 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623250 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31dcdcdc-a207-4f09-90af-82c452f9a3f0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623277 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41ef8783-9b8b-427d-b3db-56d90bb448fa-proxy-tls\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r45g8\" (UniqueName: 
\"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-kube-api-access-r45g8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blx94\" (UniqueName: \"kubernetes.io/projected/8299e03d-1f93-4032-bcad-2ce040734c86-kube-api-access-blx94\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-socket-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623378 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623394 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e59909fb-c783-46de-9955-1c31cc9fd6b2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623426 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/334b61aa-43d5-4f60-af17-50408198b8f5-signing-cabundle\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-client\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623495 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-registration-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623511 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfcw8\" (UniqueName: 
\"kubernetes.io/projected/41ef8783-9b8b-427d-b3db-56d90bb448fa-kube-api-access-xfcw8\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623528 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-bound-sa-token\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623546 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-csi-data-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623562 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15ef5f6-e972-4986-a15c-da1e90b74ae0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623593 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8299e03d-1f93-4032-bcad-2ce040734c86-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 
11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4eff370b-3729-4929-b73f-752f1b00a318-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnsct\" (UniqueName: \"kubernetes.io/projected/31dcdcdc-a207-4f09-90af-82c452f9a3f0-kube-api-access-dnsct\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623649 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41ef8783-9b8b-427d-b3db-56d90bb448fa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb48d08d-1606-49d9-a55d-53b21ad9f404-apiservice-cert\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bb35332-d7fa-4163-99c1-3de2e12a6165-metrics-tls\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623734 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28t46\" (UniqueName: \"kubernetes.io/projected/4047a74f-e1b9-40a2-b525-bc06011477d7-kube-api-access-28t46\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623762 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpr2w\" (UniqueName: \"kubernetes.io/projected/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-kube-api-access-mpr2w\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc 
kubenswrapper[4837]: I0111 17:31:48.623794 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59909fb-c783-46de-9955-1c31cc9fd6b2-config\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a71c215-dd54-402e-aaaa-ad6f2320da35-cert\") pod \"ingress-canary-wbdfb\" (UID: \"5a71c215-dd54-402e-aaaa-ad6f2320da35\") " pod="openshift-ingress-canary/ingress-canary-wbdfb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623878 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-tls\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623903 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-ca\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623919 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-config-volume\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623944 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-policies\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623961 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb35332-d7fa-4163-99c1-3de2e12a6165-config-volume\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623976 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-certs\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.623992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624012 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624028 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d310cf03-2422-47e4-a98b-4a0636c1b8f8-serving-cert\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624043 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d310cf03-2422-47e4-a98b-4a0636c1b8f8-config\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624077 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7pzq\" (UniqueName: \"kubernetes.io/projected/334b61aa-43d5-4f60-af17-50408198b8f5-kube-api-access-n7pzq\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624093 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcj4\" (UniqueName: \"kubernetes.io/projected/5a71c215-dd54-402e-aaaa-ad6f2320da35-kube-api-access-qfcj4\") pod \"ingress-canary-wbdfb\" (UID: \"5a71c215-dd54-402e-aaaa-ad6f2320da35\") " pod="openshift-ingress-canary/ingress-canary-wbdfb" Jan 11 17:31:48 crc 
kubenswrapper[4837]: I0111 17:31:48.624120 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624134 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-mountpoint-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624150 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e610467b-9c0c-47ac-86c8-2d700aba3e8e-service-ca-bundle\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624164 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-metrics-certs\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bs6rl\" (UID: \"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb63c9f5-457d-4c61-8cc6-56690e66a952-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/43082497-a570-48fc-95f8-eda27581cde7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2vscb\" (UID: \"43082497-a570-48fc-95f8-eda27581cde7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624231 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/334b61aa-43d5-4f60-af17-50408198b8f5-signing-key\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624539 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-default-certificate\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " 
pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-service-ca\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624574 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f15ef5f6-e972-4986-a15c-da1e90b74ae0-trusted-ca\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624600 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-node-bootstrap-token\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624615 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr9l9\" (UniqueName: \"kubernetes.io/projected/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-kube-api-access-xr9l9\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/41ef8783-9b8b-427d-b3db-56d90bb448fa-images\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624661 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxtr\" (UniqueName: \"kubernetes.io/projected/1bb35332-d7fa-4163-99c1-3de2e12a6165-kube-api-access-8wxtr\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624714 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46hnv\" (UniqueName: \"kubernetes.io/projected/f15ef5f6-e972-4986-a15c-da1e90b74ae0-kube-api-access-46hnv\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624733 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8299e03d-1f93-4032-bcad-2ce040734c86-proxy-tls\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-certificates\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc 
kubenswrapper[4837]: I0111 17:31:48.624770 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-dir\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624785 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f15ef5f6-e972-4986-a15c-da1e90b74ae0-metrics-tls\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624801 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31dcdcdc-a207-4f09-90af-82c452f9a3f0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624837 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cb48d08d-1606-49d9-a55d-53b21ad9f404-tmpfs\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: 
\"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624866 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-srv-cert\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624883 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrj2\" (UniqueName: \"kubernetes.io/projected/09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3-kube-api-access-hwrj2\") pod \"migrator-59844c95c7-p7jgn\" (UID: \"09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624942 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-config\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624958 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-images\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624974 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g27x\" (UniqueName: \"kubernetes.io/projected/43082497-a570-48fc-95f8-eda27581cde7-kube-api-access-8g27x\") pod \"package-server-manager-789f6589d5-2vscb\" (UID: \"43082497-a570-48fc-95f8-eda27581cde7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.624991 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/876b267e-baf3-4f7d-a4a3-49f44b4dfbb7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5ch\" (UID: \"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625017 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-config\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 
17:31:48.625047 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nll\" (UniqueName: \"kubernetes.io/projected/cb48d08d-1606-49d9-a55d-53b21ad9f404-kube-api-access-m4nll\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhdlp\" (UniqueName: \"kubernetes.io/projected/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-kube-api-access-nhdlp\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625078 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzfc\" (UniqueName: \"kubernetes.io/projected/75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3-kube-api-access-5fzfc\") pod \"control-plane-machine-set-operator-78cbb6b69f-bs6rl\" (UID: \"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-serving-cert\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625154 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625170 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4047a74f-e1b9-40a2-b525-bc06011477d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625185 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/31dcdcdc-a207-4f09-90af-82c452f9a3f0-ready\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625201 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl78m\" (UniqueName: \"kubernetes.io/projected/d310cf03-2422-47e4-a98b-4a0636c1b8f8-kube-api-access-nl78m\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625219 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb63c9f5-457d-4c61-8cc6-56690e66a952-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625249 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjfz\" (UniqueName: \"kubernetes.io/projected/f9e34a1e-5456-4b26-b347-aa569c5987d5-kube-api-access-gtjfz\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625325 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6z5d\" (UniqueName: \"kubernetes.io/projected/e610467b-9c0c-47ac-86c8-2d700aba3e8e-kube-api-access-x6z5d\") pod 
\"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625339 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-secret-volume\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eff370b-3729-4929-b73f-752f1b00a318-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625431 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slzch\" (UniqueName: \"kubernetes.io/projected/8c7124f1-e7cc-4ae8-89d2-19457c045576-kube-api-access-slzch\") pod \"multus-admission-controller-857f4d67dd-2984c\" (UID: \"8c7124f1-e7cc-4ae8-89d2-19457c045576\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 
17:31:48.625458 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-trusted-ca\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625486 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmvk\" (UniqueName: \"kubernetes.io/projected/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-kube-api-access-sqmvk\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625506 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-stats-auth\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.625803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb63c9f5-457d-4c61-8cc6-56690e66a952-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.626395 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-dir\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 
17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.627353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.627576 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-certificates\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.635172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.635919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/43082497-a570-48fc-95f8-eda27581cde7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2vscb\" (UID: \"43082497-a570-48fc-95f8-eda27581cde7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.636416 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.638797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-policies\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.640061 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-service-ca\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.640980 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: E0111 17:31:48.641542 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.141528123 +0000 UTC m=+83.319720829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.642772 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-srv-cert\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.642868 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.643039 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.643715 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-images\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: 
\"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.644913 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-tls\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.652098 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.653503 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-config\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.653526 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-ca\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.654388 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-trusted-ca\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: 
\"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.654691 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-config\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.661804 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/876b267e-baf3-4f7d-a4a3-49f44b4dfbb7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5ch\" (UID: \"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.665790 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.665909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.667309 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.669842 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.670914 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.674841 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.659225 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb63c9f5-457d-4c61-8cc6-56690e66a952-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:48 
crc kubenswrapper[4837]: I0111 17:31:48.675831 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n8rwr"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.676966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8mhv\" (UniqueName: \"kubernetes.io/projected/1b5ce6bf-72e2-494a-aa22-830e992fbec5-kube-api-access-x8mhv\") pod \"oauth-openshift-558db77b4-ldzgv\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.678573 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-serving-cert\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.680794 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l2c7d"] Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.681106 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-etcd-client\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.682608 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcls\" (UniqueName: \"kubernetes.io/projected/876b267e-baf3-4f7d-a4a3-49f44b4dfbb7-kube-api-access-xhcls\") pod \"cluster-samples-operator-665b6dd947-kq5ch\" (UID: \"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.686236 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.703059 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85wsv\" (UniqueName: \"kubernetes.io/projected/b61e27df-5c38-48b3-b6e9-bca3ce8aa429-kube-api-access-85wsv\") pod \"machine-api-operator-5694c8668f-xk9j9\" (UID: \"b61e27df-5c38-48b3-b6e9-bca3ce8aa429\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.721040 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpr2w\" (UniqueName: \"kubernetes.io/projected/d32a6f5b-b581-425c-80aa-c7deee3c5b2c-kube-api-access-mpr2w\") pod \"olm-operator-6b444d44fb-5xllw\" (UID: \"d32a6f5b-b581-425c-80aa-c7deee3c5b2c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.722884 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726205 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726328 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-csi-data-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726349 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15ef5f6-e972-4986-a15c-da1e90b74ae0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726372 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8299e03d-1f93-4032-bcad-2ce040734c86-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4eff370b-3729-4929-b73f-752f1b00a318-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726403 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnsct\" (UniqueName: \"kubernetes.io/projected/31dcdcdc-a207-4f09-90af-82c452f9a3f0-kube-api-access-dnsct\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41ef8783-9b8b-427d-b3db-56d90bb448fa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726438 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb48d08d-1606-49d9-a55d-53b21ad9f404-apiservice-cert\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726454 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bb35332-d7fa-4163-99c1-3de2e12a6165-metrics-tls\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726470 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28t46\" (UniqueName: \"kubernetes.io/projected/4047a74f-e1b9-40a2-b525-bc06011477d7-kube-api-access-28t46\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726487 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59909fb-c783-46de-9955-1c31cc9fd6b2-config\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a71c215-dd54-402e-aaaa-ad6f2320da35-cert\") pod \"ingress-canary-wbdfb\" (UID: \"5a71c215-dd54-402e-aaaa-ad6f2320da35\") " pod="openshift-ingress-canary/ingress-canary-wbdfb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726528 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-config-volume\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726542 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb35332-d7fa-4163-99c1-3de2e12a6165-config-volume\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc 
kubenswrapper[4837]: I0111 17:31:48.726557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-certs\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726577 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d310cf03-2422-47e4-a98b-4a0636c1b8f8-serving-cert\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726590 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d310cf03-2422-47e4-a98b-4a0636c1b8f8-config\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7pzq\" (UniqueName: \"kubernetes.io/projected/334b61aa-43d5-4f60-af17-50408198b8f5-kube-api-access-n7pzq\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfcj4\" (UniqueName: \"kubernetes.io/projected/5a71c215-dd54-402e-aaaa-ad6f2320da35-kube-api-access-qfcj4\") pod \"ingress-canary-wbdfb\" (UID: \"5a71c215-dd54-402e-aaaa-ad6f2320da35\") " 
pod="openshift-ingress-canary/ingress-canary-wbdfb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726636 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-mountpoint-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e610467b-9c0c-47ac-86c8-2d700aba3e8e-service-ca-bundle\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-metrics-certs\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726698 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726718 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-bs6rl\" (UID: \"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726735 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/334b61aa-43d5-4f60-af17-50408198b8f5-signing-key\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726749 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-default-certificate\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726764 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f15ef5f6-e972-4986-a15c-da1e90b74ae0-trusted-ca\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr9l9\" (UniqueName: \"kubernetes.io/projected/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-kube-api-access-xr9l9\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/41ef8783-9b8b-427d-b3db-56d90bb448fa-images\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726810 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-node-bootstrap-token\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726831 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wxtr\" (UniqueName: \"kubernetes.io/projected/1bb35332-d7fa-4163-99c1-3de2e12a6165-kube-api-access-8wxtr\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726849 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46hnv\" (UniqueName: \"kubernetes.io/projected/f15ef5f6-e972-4986-a15c-da1e90b74ae0-kube-api-access-46hnv\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726866 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8299e03d-1f93-4032-bcad-2ce040734c86-proxy-tls\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726882 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f15ef5f6-e972-4986-a15c-da1e90b74ae0-metrics-tls\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31dcdcdc-a207-4f09-90af-82c452f9a3f0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cb48d08d-1606-49d9-a55d-53b21ad9f404-tmpfs\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrj2\" (UniqueName: \"kubernetes.io/projected/09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3-kube-api-access-hwrj2\") pod \"migrator-59844c95c7-p7jgn\" (UID: \"09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726949 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhdlp\" (UniqueName: \"kubernetes.io/projected/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-kube-api-access-nhdlp\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 
crc kubenswrapper[4837]: I0111 17:31:48.726963 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzfc\" (UniqueName: \"kubernetes.io/projected/75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3-kube-api-access-5fzfc\") pod \"control-plane-machine-set-operator-78cbb6b69f-bs6rl\" (UID: \"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nll\" (UniqueName: \"kubernetes.io/projected/cb48d08d-1606-49d9-a55d-53b21ad9f404-kube-api-access-m4nll\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.726994 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/31dcdcdc-a207-4f09-90af-82c452f9a3f0-ready\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl78m\" (UniqueName: \"kubernetes.io/projected/d310cf03-2422-47e4-a98b-4a0636c1b8f8-kube-api-access-nl78m\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4047a74f-e1b9-40a2-b525-bc06011477d7-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727042 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjfz\" (UniqueName: \"kubernetes.io/projected/f9e34a1e-5456-4b26-b347-aa569c5987d5-kube-api-access-gtjfz\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727056 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6z5d\" (UniqueName: \"kubernetes.io/projected/e610467b-9c0c-47ac-86c8-2d700aba3e8e-kube-api-access-x6z5d\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727071 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-secret-volume\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727086 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eff370b-3729-4929-b73f-752f1b00a318-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727102 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-slzch\" (UniqueName: \"kubernetes.io/projected/8c7124f1-e7cc-4ae8-89d2-19457c045576-kube-api-access-slzch\") pod \"multus-admission-controller-857f4d67dd-2984c\" (UID: \"8c7124f1-e7cc-4ae8-89d2-19457c045576\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727123 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-stats-auth\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727138 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eff370b-3729-4929-b73f-752f1b00a318-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727155 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c7124f1-e7cc-4ae8-89d2-19457c045576-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2984c\" (UID: \"8c7124f1-e7cc-4ae8-89d2-19457c045576\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727170 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-plugins-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 
17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4047a74f-e1b9-40a2-b525-bc06011477d7-srv-cert\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb48d08d-1606-49d9-a55d-53b21ad9f404-webhook-cert\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e59909fb-c783-46de-9955-1c31cc9fd6b2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727238 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rjn\" (UniqueName: \"kubernetes.io/projected/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-kube-api-access-l7rjn\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31dcdcdc-a207-4f09-90af-82c452f9a3f0-tuning-conf-dir\") pod 
\"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41ef8783-9b8b-427d-b3db-56d90bb448fa-proxy-tls\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blx94\" (UniqueName: \"kubernetes.io/projected/8299e03d-1f93-4032-bcad-2ce040734c86-kube-api-access-blx94\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727306 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-socket-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727336 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e59909fb-c783-46de-9955-1c31cc9fd6b2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727351 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/334b61aa-43d5-4f60-af17-50408198b8f5-signing-cabundle\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727369 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-registration-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.727383 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfcw8\" (UniqueName: \"kubernetes.io/projected/41ef8783-9b8b-427d-b3db-56d90bb448fa-kube-api-access-xfcw8\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.729123 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb35332-d7fa-4163-99c1-3de2e12a6165-config-volume\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.729228 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-config-volume\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: E0111 17:31:48.729310 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.22929448 +0000 UTC m=+83.407487186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.729367 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-csi-data-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.729852 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f15ef5f6-e972-4986-a15c-da1e90b74ae0-trusted-ca\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc 
kubenswrapper[4837]: I0111 17:31:48.730222 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/41ef8783-9b8b-427d-b3db-56d90bb448fa-images\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.730578 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-mountpoint-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.730637 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d310cf03-2422-47e4-a98b-4a0636c1b8f8-config\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.731313 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e610467b-9c0c-47ac-86c8-2d700aba3e8e-service-ca-bundle\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.731315 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41ef8783-9b8b-427d-b3db-56d90bb448fa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.731405 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cb48d08d-1606-49d9-a55d-53b21ad9f404-tmpfs\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.731411 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8299e03d-1f93-4032-bcad-2ce040734c86-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.731930 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-certs\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.731995 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31dcdcdc-a207-4f09-90af-82c452f9a3f0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.732011 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31dcdcdc-a207-4f09-90af-82c452f9a3f0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: 
\"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.733732 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-metrics-certs\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.733893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-plugins-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.734728 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-registration-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.735015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59909fb-c783-46de-9955-1c31cc9fd6b2-config\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.736261 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eff370b-3729-4929-b73f-752f1b00a318-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: 
\"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.736539 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.736752 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/31dcdcdc-a207-4f09-90af-82c452f9a3f0-ready\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.736861 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d310cf03-2422-47e4-a98b-4a0636c1b8f8-serving-cert\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.737496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bs6rl\" (UID: \"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.737537 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/8c7124f1-e7cc-4ae8-89d2-19457c045576-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2984c\" (UID: \"8c7124f1-e7cc-4ae8-89d2-19457c045576\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.738149 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-default-certificate\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.738213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb48d08d-1606-49d9-a55d-53b21ad9f404-apiservice-cert\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.738380 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4047a74f-e1b9-40a2-b525-bc06011477d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.738694 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f15ef5f6-e972-4986-a15c-da1e90b74ae0-metrics-tls\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.739081 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-node-bootstrap-token\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.739420 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8299e03d-1f93-4032-bcad-2ce040734c86-proxy-tls\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.739538 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eff370b-3729-4929-b73f-752f1b00a318-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.739914 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4047a74f-e1b9-40a2-b525-bc06011477d7-srv-cert\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.740021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bb35332-d7fa-4163-99c1-3de2e12a6165-metrics-tls\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.740505 
4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a71c215-dd54-402e-aaaa-ad6f2320da35-cert\") pod \"ingress-canary-wbdfb\" (UID: \"5a71c215-dd54-402e-aaaa-ad6f2320da35\") " pod="openshift-ingress-canary/ingress-canary-wbdfb" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.741101 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-secret-volume\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.741182 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/334b61aa-43d5-4f60-af17-50408198b8f5-signing-cabundle\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.742797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-socket-dir\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.743010 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb48d08d-1606-49d9-a55d-53b21ad9f404-webhook-cert\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.743334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/334b61aa-43d5-4f60-af17-50408198b8f5-signing-key\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.744152 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.744197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e610467b-9c0c-47ac-86c8-2d700aba3e8e-stats-auth\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.744919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41ef8783-9b8b-427d-b3db-56d90bb448fa-proxy-tls\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.745241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e59909fb-c783-46de-9955-1c31cc9fd6b2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.747101 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r45g8\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-kube-api-access-r45g8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.765232 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g27x\" (UniqueName: \"kubernetes.io/projected/43082497-a570-48fc-95f8-eda27581cde7-kube-api-access-8g27x\") pod \"package-server-manager-789f6589d5-2vscb\" (UID: \"43082497-a570-48fc-95f8-eda27581cde7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.767088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" event={"ID":"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7","Type":"ContainerStarted","Data":"ce3680fbd087d6c8f8a518166f3ab8daef27ba23434c211e4afcd22027d4e0e3"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.772409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" event={"ID":"26867a3e-74b2-4ef0-b671-52b6f72fb0d3","Type":"ContainerStarted","Data":"8ad717fa88c35c28bff1f42d5d10ebc4d1b298663f6fc85f1f702597f5454ccd"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.774914 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" event={"ID":"8854497b-9e74-4bd1-b465-847ad61d8779","Type":"ContainerStarted","Data":"7021a31f4b0ea1b7d5470c673091897a839767ba619b0268374530bdf5344930"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.778838 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v8df8" event={"ID":"1141f492-afec-40f3-bde7-7072d6a75a68","Type":"ContainerStarted","Data":"73a286ac58474705dc48075cafc3a1bc6c04270cfe3d679f971104098fe04639"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.779794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dmnqc" event={"ID":"75ee2960-aa16-4aea-84f2-d60c34d6fb1a","Type":"ContainerStarted","Data":"f6f5684c866a5676288642fe256335f4f99d0cd4d55b54211d3dee3b0f063737"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.780748 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" event={"ID":"a45207f1-08c5-4ae8-9ac3-3e3099df8218","Type":"ContainerStarted","Data":"e2f23e00150f853a40bbfc1c762c4bdaee60feddc57e217513407c8d265cba68"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.781732 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" event={"ID":"f9f7469a-7ddb-4d35-962e-86154f7750c9","Type":"ContainerStarted","Data":"a69046d634c7d619658399991d46773b3a2bec6a5441e85c91462348e8410ae9"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.783338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" event={"ID":"76c9940b-89a9-414c-ab2a-c4c1b4519725","Type":"ContainerStarted","Data":"8c438c10c32cc1ce028a5826080a60bb19fcdd0ce715bb13c6531fe27e9f26b1"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.787356 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" event={"ID":"59f65053-461b-4ad3-acc6-2a29b1fd06c6","Type":"ContainerStarted","Data":"a0f823330e4f2dc15c304005af2604e00208e234afaf7f9d9c3c3ca8422ce6e5"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.787974 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" event={"ID":"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1","Type":"ContainerStarted","Data":"fe2b33adfeb49abcea4695e47fc309c82a42ef141bd45a56d5d901c1c6da77b6"}
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.814317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqmvk\" (UniqueName: \"kubernetes.io/projected/9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b-kube-api-access-sqmvk\") pod \"etcd-operator-b45778765-pjl6t\" (UID: \"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.826371 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-bound-sa-token\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.828834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:48 crc kubenswrapper[4837]: E0111 17:31:48.829336 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.32932177 +0000 UTC m=+83.507514476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.846809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfcw8\" (UniqueName: \"kubernetes.io/projected/41ef8783-9b8b-427d-b3db-56d90bb448fa-kube-api-access-xfcw8\") pod \"machine-config-operator-74547568cd-ps47j\" (UID: \"41ef8783-9b8b-427d-b3db-56d90bb448fa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.865536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfcj4\" (UniqueName: \"kubernetes.io/projected/5a71c215-dd54-402e-aaaa-ad6f2320da35-kube-api-access-qfcj4\") pod \"ingress-canary-wbdfb\" (UID: \"5a71c215-dd54-402e-aaaa-ad6f2320da35\") " pod="openshift-ingress-canary/ingress-canary-wbdfb"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.892628 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15ef5f6-e972-4986-a15c-da1e90b74ae0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.904012 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr9l9\" (UniqueName: \"kubernetes.io/projected/76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17-kube-api-access-xr9l9\") pod \"machine-config-server-7mr75\" (UID: \"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17\") " pod="openshift-machine-config-operator/machine-config-server-7mr75"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.922104 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.922421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-n8ldn"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.923550 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.924473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4eff370b-3729-4929-b73f-752f1b00a318-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fxkfr\" (UID: \"4eff370b-3729-4929-b73f-752f1b00a318\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.930838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 11 17:31:48 crc kubenswrapper[4837]: E0111 17:31:48.931087 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.431048742 +0000 UTC m=+83.609241448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.931488 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:48 crc kubenswrapper[4837]: E0111 17:31:48.931890 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.431880224 +0000 UTC m=+83.610072930 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.950496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnsct\" (UniqueName: \"kubernetes.io/projected/31dcdcdc-a207-4f09-90af-82c452f9a3f0-kube-api-access-dnsct\") pod \"cni-sysctl-allowlist-ds-pvfwl\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.954922 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.961889 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.967294 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhdlp\" (UniqueName: \"kubernetes.io/projected/ce38ff9a-9354-463c-a8b5-3d4bbea8694d-kube-api-access-nhdlp\") pod \"csi-hostpathplugin-tdxwr\" (UID: \"ce38ff9a-9354-463c-a8b5-3d4bbea8694d\") " pod="hostpath-provisioner/csi-hostpathplugin-tdxwr"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.967878 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.975568 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.975912 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.982291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxtr\" (UniqueName: \"kubernetes.io/projected/1bb35332-d7fa-4163-99c1-3de2e12a6165-kube-api-access-8wxtr\") pod \"dns-default-dddb7\" (UID: \"1bb35332-d7fa-4163-99c1-3de2e12a6165\") " pod="openshift-dns/dns-default-dddb7"
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.984502 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.988934 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pz586"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.994926 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw"]
Jan 11 17:31:48 crc kubenswrapper[4837]: I0111 17:31:48.997695 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.001772 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46hnv\" (UniqueName: \"kubernetes.io/projected/f15ef5f6-e972-4986-a15c-da1e90b74ae0-kube-api-access-46hnv\") pod \"ingress-operator-5b745b69d9-gsqgk\" (UID: \"f15ef5f6-e972-4986-a15c-da1e90b74ae0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.022155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slzch\" (UniqueName: \"kubernetes.io/projected/8c7124f1-e7cc-4ae8-89d2-19457c045576-kube-api-access-slzch\") pod \"multus-admission-controller-857f4d67dd-2984c\" (UID: \"8c7124f1-e7cc-4ae8-89d2-19457c045576\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.038827 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039012 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7mr75"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039296 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039373 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wbdfb"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039384 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.039635 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.53961193 +0000 UTC m=+83.717804636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039695 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039906 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.039947 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dddb7"
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.040192 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.540172304 +0000 UTC m=+83.718365010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.040504 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.043751 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7pzq\" (UniqueName: \"kubernetes.io/projected/334b61aa-43d5-4f60-af17-50408198b8f5-kube-api-access-n7pzq\") pod \"service-ca-9c57cc56f-nqs9r\" (UID: \"334b61aa-43d5-4f60-af17-50408198b8f5\") " pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.064187 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28t46\" (UniqueName: \"kubernetes.io/projected/4047a74f-e1b9-40a2-b525-bc06011477d7-kube-api-access-28t46\") pod \"catalog-operator-68c6474976-sxnbq\" (UID: \"4047a74f-e1b9-40a2-b525-bc06011477d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.081006 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwrj2\" (UniqueName: \"kubernetes.io/projected/09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3-kube-api-access-hwrj2\") pod \"migrator-59844c95c7-p7jgn\" (UID: \"09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.106391 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl78m\" (UniqueName: \"kubernetes.io/projected/d310cf03-2422-47e4-a98b-4a0636c1b8f8-kube-api-access-nl78m\") pod \"service-ca-operator-777779d784-vzfwx\" (UID: \"d310cf03-2422-47e4-a98b-4a0636c1b8f8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.127697 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzfc\" (UniqueName: \"kubernetes.io/projected/75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3-kube-api-access-5fzfc\") pod \"control-plane-machine-set-operator-78cbb6b69f-bs6rl\" (UID: \"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.143021 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.143383 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.643368334 +0000 UTC m=+83.821561040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.151907 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nll\" (UniqueName: \"kubernetes.io/projected/cb48d08d-1606-49d9-a55d-53b21ad9f404-kube-api-access-m4nll\") pod \"packageserver-d55dfcdfc-wphsv\" (UID: \"cb48d08d-1606-49d9-a55d-53b21ad9f404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.169828 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6z5d\" (UniqueName: \"kubernetes.io/projected/e610467b-9c0c-47ac-86c8-2d700aba3e8e-kube-api-access-x6z5d\") pod \"router-default-5444994796-cscqg\" (UID: \"e610467b-9c0c-47ac-86c8-2d700aba3e8e\") " pod="openshift-ingress/router-default-5444994796-cscqg"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.173443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.192931 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjfz\" (UniqueName: \"kubernetes.io/projected/f9e34a1e-5456-4b26-b347-aa569c5987d5-kube-api-access-gtjfz\") pod \"marketplace-operator-79b997595-gkkhd\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.221617 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rjn\" (UniqueName: \"kubernetes.io/projected/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-kube-api-access-l7rjn\") pod \"collect-profiles-29469210-m2c5r\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.244135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e59909fb-c783-46de-9955-1c31cc9fd6b2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9dr4c\" (UID: \"e59909fb-c783-46de-9955-1c31cc9fd6b2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.244521 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.245165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blx94\" (UniqueName: \"kubernetes.io/projected/8299e03d-1f93-4032-bcad-2ce040734c86-kube-api-access-blx94\") pod \"machine-config-controller-84d6567774-zdg78\" (UID: \"8299e03d-1f93-4032-bcad-2ce040734c86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78"
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.245269 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.74523621 +0000 UTC m=+83.923428916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.329461 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-cscqg"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.337449 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.338269 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.338632 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.338653 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.338791 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.339059 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.340238 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.340503 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.340755 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c"
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.345559 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.845542517 +0000 UTC m=+84.023735223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.345394 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.345978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.346251 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.846243805 +0000 UTC m=+84.024436511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.348925 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx"
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.447563 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.447805 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.947778663 +0000 UTC m=+84.125971379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.448268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.448705 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:49.948675406 +0000 UTC m=+84.126868112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.477512 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xk9j9"]
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.549809 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr"]
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.550540 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.550770 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.050733097 +0000 UTC m=+84.228925803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.550872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr"
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.551172 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.051161168 +0000 UTC m=+84.229353874 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.565356 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pjl6t"]
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.574775 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb"]
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.601419 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ldzgv"]
Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.651883 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.652027 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.152005509 +0000 UTC m=+84.330198215 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.652141 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.652416 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.152405149 +0000 UTC m=+84.330597845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.711558 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2984c"] Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.739515 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dddb7"] Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.743761 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wbdfb"] Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.745251 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nqs9r"] Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.753560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.753907 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.253895156 +0000 UTC m=+84.432087862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:49 crc kubenswrapper[4837]: W0111 17:31:49.779877 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eff370b_3729_4929_b73f_752f1b00a318.slice/crio-e490614efd38e9d5e950fed51c6fc9cf83fa0226d0afd75f44fe7b06e73f6638 WatchSource:0}: Error finding container e490614efd38e9d5e950fed51c6fc9cf83fa0226d0afd75f44fe7b06e73f6638: Status 404 returned error can't find the container with id e490614efd38e9d5e950fed51c6fc9cf83fa0226d0afd75f44fe7b06e73f6638 Jan 11 17:31:49 crc kubenswrapper[4837]: W0111 17:31:49.781213 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b730a03_0d21_4b8a_8fa6_3d64f0b3aa5b.slice/crio-faa27c049d07dc51b41eaebff4493fe71088210aaadf6ad146cbd1a9a3adc71c WatchSource:0}: Error finding container faa27c049d07dc51b41eaebff4493fe71088210aaadf6ad146cbd1a9a3adc71c: Status 404 returned error can't find the container with id faa27c049d07dc51b41eaebff4493fe71088210aaadf6ad146cbd1a9a3adc71c Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.810447 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tdxwr"] Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.814371 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" 
event={"ID":"8bd32a92-1f67-46c3-831f-6b50d22d0fb2","Type":"ContainerStarted","Data":"703bc6ecf63da73c62ea46f43ee464ad305b02c2bdf1908b2b096a5429a514a8"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.818614 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk"] Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.818824 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" event={"ID":"4eff370b-3729-4929-b73f-752f1b00a318","Type":"ContainerStarted","Data":"e490614efd38e9d5e950fed51c6fc9cf83fa0226d0afd75f44fe7b06e73f6638"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.827941 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" event={"ID":"26867a3e-74b2-4ef0-b671-52b6f72fb0d3","Type":"ContainerStarted","Data":"0425775ad7cc2949f62ee1514d263022e6273403b11275766a0c6c615d540458"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.831591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" event={"ID":"76c9940b-89a9-414c-ab2a-c4c1b4519725","Type":"ContainerStarted","Data":"768d549fbdcc0e9a96d2e67c13e0d6694f4d56653fbacb74d5f141cf0e1bb3c4"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.832171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" event={"ID":"d32a6f5b-b581-425c-80aa-c7deee3c5b2c","Type":"ContainerStarted","Data":"99c8cbffc70df21e2cd22fa383cda58ee86caf65f94261ded36742999f3a3712"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.832843 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pz586" 
event={"ID":"09971765-a82b-4ab2-b79a-5defb8feb416","Type":"ContainerStarted","Data":"f6e1236898ab4074916f501d0ded5be2df2e40ebb7aabafff4bad74363db2b3c"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.833797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" event={"ID":"f9f7469a-7ddb-4d35-962e-86154f7750c9","Type":"ContainerStarted","Data":"1ba2d3b02fd38acfbe485a7c569d325c8f90c37299ef3ce62f46c78cd54d92e5"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.834540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" event={"ID":"31dcdcdc-a207-4f09-90af-82c452f9a3f0","Type":"ContainerStarted","Data":"cdc30b8e30daa7d568941c549c77456206d6e80f8186ebb36fef18a3c4bbef63"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.842311 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" event={"ID":"1d6837bf-33ce-474a-a4f5-22d1fa7b95b1","Type":"ContainerStarted","Data":"905ade78f18cec373e16cedae5ad557f442502681d822d7626712451deecc716"} Jan 11 17:31:49 crc kubenswrapper[4837]: W0111 17:31:49.843316 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c7124f1_e7cc_4ae8_89d2_19457c045576.slice/crio-155709342ac7209760ea7982f9d7b5c3773b1dbe3ef81f100a8eaef63503d70e WatchSource:0}: Error finding container 155709342ac7209760ea7982f9d7b5c3773b1dbe3ef81f100a8eaef63503d70e: Status 404 returned error can't find the container with id 155709342ac7209760ea7982f9d7b5c3773b1dbe3ef81f100a8eaef63503d70e Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.845380 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" 
event={"ID":"6a05a50b-ccec-45f2-be70-d210e5334d18","Type":"ContainerStarted","Data":"eff99be41239800ea355b21078345621e79cc365bf0a79567c9cd1aa1991f507"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.846289 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j"] Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.854966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.855815 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.355803123 +0000 UTC m=+84.533995829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.862937 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" event={"ID":"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b","Type":"ContainerStarted","Data":"faa27c049d07dc51b41eaebff4493fe71088210aaadf6ad146cbd1a9a3adc71c"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.873638 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dmnqc" event={"ID":"75ee2960-aa16-4aea-84f2-d60c34d6fb1a","Type":"ContainerStarted","Data":"eadb0e8e349d4956e3dc9e921083da199dbcab2b5fc27a0b4a05668dcbf564ae"} Jan 11 17:31:49 crc kubenswrapper[4837]: W0111 17:31:49.874162 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod334b61aa_43d5_4f60_af17_50408198b8f5.slice/crio-7ac523c2da3ee0721190976ceb06f27a9368a5ccb00cd961f29dce209ecd9c64 WatchSource:0}: Error finding container 7ac523c2da3ee0721190976ceb06f27a9368a5ccb00cd961f29dce209ecd9c64: Status 404 returned error can't find the container with id 7ac523c2da3ee0721190976ceb06f27a9368a5ccb00cd961f29dce209ecd9c64 Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.874506 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dmnqc" Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.875187 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-7mr75" event={"ID":"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17","Type":"ContainerStarted","Data":"789708a78ef315e976ceb17073f0e89acd90e2aaa66bb1a4131ee1e2f48fb3fa"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.886011 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" event={"ID":"8ec2b154-4844-440e-bec5-911d8456ac91","Type":"ContainerStarted","Data":"5e5ed3013fa3e19649e8c06b35b9c03b42279291ccd361b3e4bb400e07e25e11"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.886105 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-dmnqc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.886131 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dmnqc" podUID="75ee2960-aa16-4aea-84f2-d60c34d6fb1a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.954359 4837 generic.go:334] "Generic (PLEG): container finished" podID="8854497b-9e74-4bd1-b465-847ad61d8779" containerID="497b6b26c880b5e75bbaf147912973b05feebc7d35a6230533537dfc8ed47e97" exitCode=0 Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.955309 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" event={"ID":"8854497b-9e74-4bd1-b465-847ad61d8779","Type":"ContainerDied","Data":"497b6b26c880b5e75bbaf147912973b05feebc7d35a6230533537dfc8ed47e97"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.955642 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.955796 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.455772541 +0000 UTC m=+84.633965247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.960182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.968242 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" event={"ID":"58f0dabf-1deb-453a-9ac9-11324df2b806","Type":"ContainerStarted","Data":"5c3e3a429bae6b8050cca7bd0a31059a495ae057b3392e1ece1ccca26085e455"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.969094 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" event={"ID":"6184fa1c-4a00-4ae3-9dad-a5672220571e","Type":"ContainerStarted","Data":"80f22f66cb81328d2bb06e50235bbce666e4967b0173c697d18f43aaea20475b"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.971660 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v8df8" event={"ID":"1141f492-afec-40f3-bde7-7072d6a75a68","Type":"ContainerStarted","Data":"e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.973498 4837 generic.go:334] "Generic (PLEG): container finished" podID="ecf9daad-b73a-4f1a-9247-ee4973ad1bc7" containerID="214788fec0002001096eb6600b5c06baf5c7a0ef7c3216b620c0770542d23017" exitCode=0 Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.973545 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" event={"ID":"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7","Type":"ContainerDied","Data":"214788fec0002001096eb6600b5c06baf5c7a0ef7c3216b620c0770542d23017"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.978025 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-cscqg" event={"ID":"e610467b-9c0c-47ac-86c8-2d700aba3e8e","Type":"ContainerStarted","Data":"3a9c9ed8abb1e5f65dc66b1b4d3d27296303110e532c0a1e00005dc2bb6e6f29"} Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.983961 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7996w" podStartSLOduration=41.983940029 podStartE2EDuration="41.983940029s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:49.955083164 +0000 UTC 
m=+84.133275870" watchObservedRunningTime="2026-01-11 17:31:49.983940029 +0000 UTC m=+84.162132735" Jan 11 17:31:49 crc kubenswrapper[4837]: E0111 17:31:49.989329 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.489313486 +0000 UTC m=+84.667506192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:49 crc kubenswrapper[4837]: I0111 17:31:49.999518 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" event={"ID":"b61e27df-5c38-48b3-b6e9-bca3ce8aa429","Type":"ContainerStarted","Data":"55d5805b9fbe1f3ccd7e563a6a4a043ba421b0fa10e0be0288e7a724e25298a1"} Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.001428 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" event={"ID":"59f65053-461b-4ad3-acc6-2a29b1fd06c6","Type":"ContainerStarted","Data":"13d99d27ec5e502b839c55a45aa4bb7725007fe09de02b6c5d6e7490291a3f57"} Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.006539 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r"] Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.044462 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn"] Jan 11 17:31:50 
crc kubenswrapper[4837]: I0111 17:31:50.099032 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.100509 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.600478899 +0000 UTC m=+84.778671595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.143842 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv"] Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.202040 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.202341 4837 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.702330595 +0000 UTC m=+84.880523301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.303035 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.303839 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.803825182 +0000 UTC m=+84.982017888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.408768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.409531 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:50.909498536 +0000 UTC m=+85.087691242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.506606 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl"] Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.507797 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78"] Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.509964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.510363 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.010347176 +0000 UTC m=+85.188539882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: W0111 17:31:50.595355 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e7928f_bf2f_4a17_b2eb_f4fc925c7ce3.slice/crio-f43ae1a47c1bddc6da7837e1bb6c035b696385ecfab81bc81462cd097ec644ef WatchSource:0}: Error finding container f43ae1a47c1bddc6da7837e1bb6c035b696385ecfab81bc81462cd097ec644ef: Status 404 returned error can't find the container with id f43ae1a47c1bddc6da7837e1bb6c035b696385ecfab81bc81462cd097ec644ef Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.611641 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.611956 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.111928255 +0000 UTC m=+85.290120961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.662871 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq"] Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.662916 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx"] Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.676924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gkkhd"] Jan 11 17:31:50 crc kubenswrapper[4837]: W0111 17:31:50.708963 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd310cf03_2422_47e4_a98b_4a0636c1b8f8.slice/crio-83b43352aad905b86af92b3a77f9d525df68449908156f47ece44a29466ffddb WatchSource:0}: Error finding container 83b43352aad905b86af92b3a77f9d525df68449908156f47ece44a29466ffddb: Status 404 returned error can't find the container with id 83b43352aad905b86af92b3a77f9d525df68449908156f47ece44a29466ffddb Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.718751 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:50 crc 
kubenswrapper[4837]: E0111 17:31:50.719225 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.21921073 +0000 UTC m=+85.397403436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.727454 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c"] Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.820790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.821347 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.321335872 +0000 UTC m=+85.499528578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.926180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:50 crc kubenswrapper[4837]: E0111 17:31:50.926563 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.426543574 +0000 UTC m=+85.604736290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:50 crc kubenswrapper[4837]: I0111 17:31:50.964704 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dmnqc" podStartSLOduration=42.964690846 podStartE2EDuration="42.964690846s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:50.960060758 +0000 UTC m=+85.138253464" watchObservedRunningTime="2026-01-11 17:31:50.964690846 +0000 UTC m=+85.142883552" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.011504 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fpvgj" podStartSLOduration=43.011486859 podStartE2EDuration="43.011486859s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:51.00248559 +0000 UTC m=+85.180678296" watchObservedRunningTime="2026-01-11 17:31:51.011486859 +0000 UTC m=+85.189679565" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.022639 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" event={"ID":"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7","Type":"ContainerStarted","Data":"4dbf8640ec8f1c021340d9fdc31e0975ae0379f2ce4f9645bbe6d24a66b2c781"} Jan 11 17:31:51 crc 
kubenswrapper[4837]: I0111 17:31:51.027382 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.027692 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.527663101 +0000 UTC m=+85.705855807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.033329 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" event={"ID":"f9e34a1e-5456-4b26-b347-aa569c5987d5","Type":"ContainerStarted","Data":"06566f0c1c3ccdfdd35cacfa70f38cd1bcb1f7c65955d9575205fbbfabc11c09"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.051577 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dddb7" event={"ID":"1bb35332-d7fa-4163-99c1-3de2e12a6165","Type":"ContainerStarted","Data":"af600d07369a3c6eaffeff0fdbb77078ca7b179236b37533fadbce06ddb01c15"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.072213 4837 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-server-7mr75" event={"ID":"76f75c9a-7f59-4fa1-99e2-7c8f67a6aa17","Type":"ContainerStarted","Data":"83121461f016f7f9ff2ad2475b5e18c81d3c01cbc734ae7733e8183b8f47d2be"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.092286 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" event={"ID":"cb48d08d-1606-49d9-a55d-53b21ad9f404","Type":"ContainerStarted","Data":"37e2c446ce9c19784541a314112f77cf3edf1e728cbca104bbfc432bdd6ce154"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.106815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wbdfb" event={"ID":"5a71c215-dd54-402e-aaaa-ad6f2320da35","Type":"ContainerStarted","Data":"ceca7a731ff49d71259e559f1da7e530dd4eb3936b5b72bf6391d57524a4ccbc"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.119555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" event={"ID":"d32a6f5b-b581-425c-80aa-c7deee3c5b2c","Type":"ContainerStarted","Data":"5b1c7afe896bfe745219698fce1f1b8b34ca50e97354e7739fddfa0296f63fc8"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.121288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" event={"ID":"8ec2b154-4844-440e-bec5-911d8456ac91","Type":"ContainerStarted","Data":"e27918ae606620933c6878bbefd23556655d026ed3230da3dd9ff7c2261c77aa"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.122242 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn" event={"ID":"09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3","Type":"ContainerStarted","Data":"bc8c6fce6113e7eeb09ebee4c05df8855a05ac47ee73d9c4fc75c8624afd88a2"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.123276 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" event={"ID":"334b61aa-43d5-4f60-af17-50408198b8f5","Type":"ContainerStarted","Data":"7ac523c2da3ee0721190976ceb06f27a9368a5ccb00cd961f29dce209ecd9c64"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.123993 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" event={"ID":"e59909fb-c783-46de-9955-1c31cc9fd6b2","Type":"ContainerStarted","Data":"3b9e4de37901c64e145dd4669c228212de858f244507d8947cc1ad853dbfb224"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.127456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" event={"ID":"1b5ce6bf-72e2-494a-aa22-830e992fbec5","Type":"ContainerStarted","Data":"a47addc44e48347ea1e4f8889bae172c523bd73bbeeef86fe01a6cf74f67a66b"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.129525 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" event={"ID":"8bd32a92-1f67-46c3-831f-6b50d22d0fb2","Type":"ContainerStarted","Data":"84acc1d85d949325cb0f65a1b09fb317d58be1a1df57a31676e15bf3e007e723"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.131499 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" event={"ID":"d310cf03-2422-47e4-a98b-4a0636c1b8f8","Type":"ContainerStarted","Data":"83b43352aad905b86af92b3a77f9d525df68449908156f47ece44a29466ffddb"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.132333 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" 
event={"ID":"43082497-a570-48fc-95f8-eda27581cde7","Type":"ContainerStarted","Data":"162f08eba93a8dd815713b59f1f573ec51915cd68af408303b1b5000a9e86df3"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.134099 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.134441 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.634415422 +0000 UTC m=+85.812608138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.135073 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.135426 4837 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.635416397 +0000 UTC m=+85.813609173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.137711 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" event={"ID":"58f0dabf-1deb-453a-9ac9-11324df2b806","Type":"ContainerStarted","Data":"e435411cf289f0177b962b79669a60571a793abe4182c9ee12d6b0a97b9f828e"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.140116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" event={"ID":"f15ef5f6-e972-4986-a15c-da1e90b74ae0","Type":"ContainerStarted","Data":"185fd70c527ebbea6efe75b5e031f36580005a60c750e615ebc3e983fb46fec8"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.140148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" event={"ID":"f15ef5f6-e972-4986-a15c-da1e90b74ae0","Type":"ContainerStarted","Data":"6c56b3bc34cba813ea4fde5a7c8487549e056cedcf62c43239761e7f5d1add77"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.144714 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" 
event={"ID":"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3","Type":"ContainerStarted","Data":"f43ae1a47c1bddc6da7837e1bb6c035b696385ecfab81bc81462cd097ec644ef"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.156450 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" event={"ID":"8299e03d-1f93-4032-bcad-2ce040734c86","Type":"ContainerStarted","Data":"7901d53212989f204da05f1f12482c218be234703b94811b414d3b3da9ac2a12"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.158307 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7mr75" podStartSLOduration=6.15828946 podStartE2EDuration="6.15828946s" podCreationTimestamp="2026-01-11 17:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:51.101620226 +0000 UTC m=+85.279812922" watchObservedRunningTime="2026-01-11 17:31:51.15828946 +0000 UTC m=+85.336482166" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.158894 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wlfcq" podStartSLOduration=43.158886775 podStartE2EDuration="43.158886775s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:51.156657289 +0000 UTC m=+85.334849995" watchObservedRunningTime="2026-01-11 17:31:51.158886775 +0000 UTC m=+85.337079491" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.164957 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" 
event={"ID":"90317184-ec5a-4dc2-a9f8-6075c0d78aa1","Type":"ContainerStarted","Data":"d3dd044288f2ef676d6e67142fad923ef85f583071149f64a963d044c59dfa32"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.166767 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" event={"ID":"31dcdcdc-a207-4f09-90af-82c452f9a3f0","Type":"ContainerStarted","Data":"8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.168394 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" event={"ID":"a45207f1-08c5-4ae8-9ac3-3e3099df8218","Type":"ContainerStarted","Data":"f0ee07913ea13bdfbb39b9c4b29ec988d77e03989911cc6d45a3a0f9704cbe5d"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.171074 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" event={"ID":"4047a74f-e1b9-40a2-b525-bc06011477d7","Type":"ContainerStarted","Data":"eb85afb7e30c88e3df577ccfbf9ad84a827b75dce50b0c9732003932b9c4e9d6"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.171909 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" event={"ID":"41ef8783-9b8b-427d-b3db-56d90bb448fa","Type":"ContainerStarted","Data":"6b356ce2500a56532c504d957ea54cb7b0c78170d54ed14f76452dbd8169c8c7"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.173393 4837 generic.go:334] "Generic (PLEG): container finished" podID="09971765-a82b-4ab2-b79a-5defb8feb416" containerID="32d78de844cdcc53b5aa150014ab1aed23bc93b74b182fda976d317ff928705e" exitCode=0 Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.173705 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pz586" 
event={"ID":"09971765-a82b-4ab2-b79a-5defb8feb416","Type":"ContainerDied","Data":"32d78de844cdcc53b5aa150014ab1aed23bc93b74b182fda976d317ff928705e"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.175921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" event={"ID":"6a05a50b-ccec-45f2-be70-d210e5334d18","Type":"ContainerStarted","Data":"66a270250751beaecb86abc82c7eca12fcaf80cc444c713d9ebfbcc97d07cbc9"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.195323 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" event={"ID":"6184fa1c-4a00-4ae3-9dad-a5672220571e","Type":"ContainerStarted","Data":"6b7188baf38fdb16f558ec0914b6ee2aaed743f2512d883f014c0e1a3a58fbe0"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.207095 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" event={"ID":"ce38ff9a-9354-463c-a8b5-3d4bbea8694d","Type":"ContainerStarted","Data":"0c01a60a4ad67fec8a8e5ac9e68127a0d2254e3bdd56a82a5724e199a89851a6"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.209499 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" event={"ID":"8c7124f1-e7cc-4ae8-89d2-19457c045576","Type":"ContainerStarted","Data":"155709342ac7209760ea7982f9d7b5c3773b1dbe3ef81f100a8eaef63503d70e"} Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.210164 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-dmnqc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.210190 4837 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-dmnqc" podUID="75ee2960-aa16-4aea-84f2-d60c34d6fb1a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.220736 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.232043 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" podStartSLOduration=43.2320218 podStartE2EDuration="43.2320218s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:51.227268119 +0000 UTC m=+85.405460845" watchObservedRunningTime="2026-01-11 17:31:51.2320218 +0000 UTC m=+85.410214506" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.236769 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.237037 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.737019337 +0000 UTC m=+85.915212043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.237521 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.239738 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.739723936 +0000 UTC m=+85.917916642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.242495 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.258669 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-v8df8" podStartSLOduration=43.258655109 podStartE2EDuration="43.258655109s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:51.258041903 +0000 UTC m=+85.436234609" watchObservedRunningTime="2026-01-11 17:31:51.258655109 +0000 UTC m=+85.436847815" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.340711 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.341019 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-11 17:31:51.841004507 +0000 UTC m=+86.019197213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.352875 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" podStartSLOduration=42.35285776 podStartE2EDuration="42.35285776s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:51.338058992 +0000 UTC m=+85.516251698" watchObservedRunningTime="2026-01-11 17:31:51.35285776 +0000 UTC m=+85.531050466" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.354229 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-l2c7d" podStartSLOduration=43.354223624 podStartE2EDuration="43.354223624s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:51.352074699 +0000 UTC m=+85.530267405" watchObservedRunningTime="2026-01-11 17:31:51.354223624 +0000 UTC m=+85.532416330" Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.442502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.442861 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:51.942846533 +0000 UTC m=+86.121039239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.543727 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.544089 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.044074363 +0000 UTC m=+86.222267069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.645844 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.646337 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.146311259 +0000 UTC m=+86.324504005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.747448 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.747613 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.24758219 +0000 UTC m=+86.425774936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.747793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.748434 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.248414431 +0000 UTC m=+86.426607177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.848891 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.849166 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.349140498 +0000 UTC m=+86.527333204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.849325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.849598 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.34958595 +0000 UTC m=+86.527778656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.950503 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.950624 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.450600154 +0000 UTC m=+86.628792880 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:51 crc kubenswrapper[4837]: I0111 17:31:51.950812 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:51 crc kubenswrapper[4837]: E0111 17:31:51.951206 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.451191119 +0000 UTC m=+86.629383835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.052318 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.052881 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.552856011 +0000 UTC m=+86.731048757 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.154837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.155365 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.655339183 +0000 UTC m=+86.833531919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.256028 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.256146 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.756125261 +0000 UTC m=+86.934317977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.258309 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.258592 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.758577214 +0000 UTC m=+86.936769920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.360748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.360985 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.860959954 +0000 UTC m=+87.039152650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.361085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.361463 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.861456966 +0000 UTC m=+87.039649672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.462126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.462284 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.962260206 +0000 UTC m=+87.140452912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.462389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.462966 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:52.962940133 +0000 UTC m=+87.141132869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.563794 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.563952 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.063926137 +0000 UTC m=+87.242118843 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.564027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.564382 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.064373118 +0000 UTC m=+87.242565824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.665338 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.665502 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.165470555 +0000 UTC m=+87.343663261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.665638 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.665941 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.165933156 +0000 UTC m=+87.344125862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.766649 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.767222 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.267208228 +0000 UTC m=+87.445400934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.868382 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.868732 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.368720475 +0000 UTC m=+87.546913181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:52 crc kubenswrapper[4837]: I0111 17:31:52.968984 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:52 crc kubenswrapper[4837]: E0111 17:31:52.969367 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.46935225 +0000 UTC m=+87.647544956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.070738 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.071029 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.571018781 +0000 UTC m=+87.749211487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.171546 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.171707 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.671667027 +0000 UTC m=+87.849859733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.171847 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.172179 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.672171279 +0000 UTC m=+87.850363985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.218502 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dddb7" event={"ID":"1bb35332-d7fa-4163-99c1-3de2e12a6165","Type":"ContainerStarted","Data":"383d3a390f0104814291446bc88ba28ef3552045714a035eb36acd3413e7104a"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.219849 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" event={"ID":"43082497-a570-48fc-95f8-eda27581cde7","Type":"ContainerStarted","Data":"90f890094640588eece2f28f5ac9e65bd4184ff7cc027fc26926f1f97674c212"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.220904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" event={"ID":"4eff370b-3729-4929-b73f-752f1b00a318","Type":"ContainerStarted","Data":"9b09c1c066fe687af0f475ce61463e221ac17635652196e5f5b8367b793d2861"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.222009 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" event={"ID":"9b730a03-0d21-4b8a-8fa6-3d64f0b3aa5b","Type":"ContainerStarted","Data":"a1b7f7df1120fce4b70fd6634e7834f034f57c30718241b79cce77566e55dc38"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.223070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" 
event={"ID":"334b61aa-43d5-4f60-af17-50408198b8f5","Type":"ContainerStarted","Data":"dd47cf22d2092201f57e886661935b94976d3f2c449885ff82094bd7b39e6403"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.224099 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" event={"ID":"41ef8783-9b8b-427d-b3db-56d90bb448fa","Type":"ContainerStarted","Data":"afe44cc25a63885ea28fa3e58635174b9a9e5f82f9942750daca4e31cef7f58c"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.225094 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-cscqg" event={"ID":"e610467b-9c0c-47ac-86c8-2d700aba3e8e","Type":"ContainerStarted","Data":"01412669f5b9adf1515772cfcfd404063522d0abc6685074ef660be9998334a6"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.226219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" event={"ID":"b61e27df-5c38-48b3-b6e9-bca3ce8aa429","Type":"ContainerStarted","Data":"e75a422ccba173613060ac5dcbf3c4ffbd52e203686a2ad7dd5eebe0b77ede57"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.227455 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" event={"ID":"59f65053-461b-4ad3-acc6-2a29b1fd06c6","Type":"ContainerStarted","Data":"1cbad7a46598469795f895e10b0f74832c125c532ad4c737d3961e98f35b9c96"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.228601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" event={"ID":"8c7124f1-e7cc-4ae8-89d2-19457c045576","Type":"ContainerStarted","Data":"adaac6993af6c1acc538f0c81b0866726d49b363b9283a95a9a0822bd45ad2a3"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.229659 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" event={"ID":"90317184-ec5a-4dc2-a9f8-6075c0d78aa1","Type":"ContainerStarted","Data":"68d6e80bddec6bf073fe45ce7b2b3e74314236ca041b7b15504ef166b0de6576"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.230820 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wbdfb" event={"ID":"5a71c215-dd54-402e-aaaa-ad6f2320da35","Type":"ContainerStarted","Data":"c1d8a18dc6304f4e478b99a6c80025c4f301e92217c43e5664bdddf8991ad499"} Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.231566 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.248719 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.272796 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.272944 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.772919057 +0000 UTC m=+87.951111773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.273837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.276476 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.776449197 +0000 UTC m=+87.954641973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.281791 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podStartSLOduration=7.281775533 podStartE2EDuration="7.281775533s" podCreationTimestamp="2026-01-11 17:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:53.280905161 +0000 UTC m=+87.459097877" watchObservedRunningTime="2026-01-11 17:31:53.281775533 +0000 UTC m=+87.459968239" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.296283 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9fvj" podStartSLOduration=45.296266542 podStartE2EDuration="45.296266542s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:53.294519128 +0000 UTC m=+87.472711834" watchObservedRunningTime="2026-01-11 17:31:53.296266542 +0000 UTC m=+87.474459238" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.320838 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" podStartSLOduration=45.320823768 podStartE2EDuration="45.320823768s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:53.318609812 +0000 UTC m=+87.496802518" watchObservedRunningTime="2026-01-11 17:31:53.320823768 +0000 UTC m=+87.499016474" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.348541 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" podStartSLOduration=44.348524234 podStartE2EDuration="44.348524234s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:53.348287138 +0000 UTC m=+87.526479844" watchObservedRunningTime="2026-01-11 17:31:53.348524234 +0000 UTC m=+87.526716940" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.368445 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r58pw" podStartSLOduration=45.368429902 podStartE2EDuration="45.368429902s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:53.367838507 +0000 UTC m=+87.546031213" watchObservedRunningTime="2026-01-11 17:31:53.368429902 +0000 UTC m=+87.546622608" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.374935 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.375118 4837 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.875091652 +0000 UTC m=+88.053284358 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.375247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.375486 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.875473202 +0000 UTC m=+88.053665908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.397372 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mklqs" podStartSLOduration=45.39735482 podStartE2EDuration="45.39735482s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:53.395196594 +0000 UTC m=+87.573389300" watchObservedRunningTime="2026-01-11 17:31:53.39735482 +0000 UTC m=+87.575547516" Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.476943 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.477135 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.977107712 +0000 UTC m=+88.155300418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.477334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.477645 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:53.977632955 +0000 UTC m=+88.155825661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.579313 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.579510 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.079485952 +0000 UTC m=+88.257678658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.579812 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.580152 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.080144178 +0000 UTC m=+88.258336884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.681195 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.681548 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.181531422 +0000 UTC m=+88.359724128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.783031 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.783466 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.283453619 +0000 UTC m=+88.461646325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.884227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.884751 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.384725501 +0000 UTC m=+88.562918267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:53 crc kubenswrapper[4837]: I0111 17:31:53.986384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:53 crc kubenswrapper[4837]: E0111 17:31:53.987053 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.487041049 +0000 UTC m=+88.665233755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.097261 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.097699 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.597661609 +0000 UTC m=+88.775854315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.199459 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.199911 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.699899364 +0000 UTC m=+88.878092070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.237488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" event={"ID":"cb48d08d-1606-49d9-a55d-53b21ad9f404","Type":"ContainerStarted","Data":"59f142fd79b76637d77b90c4b1d3277509dbb477918b14c397be87dff9b1e2bf"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.237703 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.239422 4837 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wphsv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.239459 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" podUID="cb48d08d-1606-49d9-a55d-53b21ad9f404" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.252805 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" 
event={"ID":"ecf9daad-b73a-4f1a-9247-ee4973ad1bc7","Type":"ContainerStarted","Data":"e73e05a6eb5e8d9464b72194698bee956ae071161b05a317cac154569d6acf67"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.254703 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" podStartSLOduration=45.254692721 podStartE2EDuration="45.254692721s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.25231191 +0000 UTC m=+88.430504606" watchObservedRunningTime="2026-01-11 17:31:54.254692721 +0000 UTC m=+88.432885417" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.276213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" event={"ID":"e59909fb-c783-46de-9955-1c31cc9fd6b2","Type":"ContainerStarted","Data":"3bbea35785cfe2b4357bc2f8813b6aebcf34cf807859a7c1d06ba30e2436c553"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.292350 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" event={"ID":"4047a74f-e1b9-40a2-b525-bc06011477d7","Type":"ContainerStarted","Data":"66821fcc6c103bca99b897c9241896fc6e88973171b0cc20cfe80333dc0d18d8"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.295861 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pvfwl"] Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.299591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" event={"ID":"75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3","Type":"ContainerStarted","Data":"e134be6724c1630936dc813e1e7a8dae81118fe5e48762140b6e1206988583f8"} Jan 11 17:31:54 
crc kubenswrapper[4837]: I0111 17:31:54.300498 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" event={"ID":"f9e34a1e-5456-4b26-b347-aa569c5987d5","Type":"ContainerStarted","Data":"22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.301252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" event={"ID":"1b5ce6bf-72e2-494a-aa22-830e992fbec5","Type":"ContainerStarted","Data":"e85f3b8dd2e3e1c3b105111e316094cea33f35099a17bdeeaec768ca613f28ce"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.302313 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.303563 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" event={"ID":"8854497b-9e74-4bd1-b465-847ad61d8779","Type":"ContainerStarted","Data":"4671007075e89e6deba029e43deca89abfc63606076d26da371bd629ad7ce674"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.303634 4837 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ldzgv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.303661 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" podUID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.304791 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.304870 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.804852709 +0000 UTC m=+88.983045415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.304802 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" event={"ID":"ce38ff9a-9354-463c-a8b5-3d4bbea8694d","Type":"ContainerStarted","Data":"f1edde94bd273f1203af6d2d4d7a36f77080e5a182710e7f5835d0fb1a1cd065"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.305027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 
17:31:54.306583 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.806433739 +0000 UTC m=+88.984626445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.307423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" event={"ID":"a45207f1-08c5-4ae8-9ac3-3e3099df8218","Type":"ContainerStarted","Data":"78f1ba0bf48d1c9ccc18ad614f955b07d1cbe19fd837490adba43f102ba46363"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.310263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" event={"ID":"d310cf03-2422-47e4-a98b-4a0636c1b8f8","Type":"ContainerStarted","Data":"8e1ea8f2f2db22704a461bd22ebd2169b73d5be905469ee3aaa11fb818d448a7"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.311732 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" event={"ID":"8299e03d-1f93-4032-bcad-2ce040734c86","Type":"ContainerStarted","Data":"e2b22d326431e2aa6dbb458d92b3de1b2d25c3873ea4876906f46dfa22dbdc2e"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.312508 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn" 
event={"ID":"09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3","Type":"ContainerStarted","Data":"7f7c50e307f728818823cb10441207bb8b0da28bd8239b2fda69cdb7b6ffd5aa"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.316141 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" event={"ID":"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7","Type":"ContainerStarted","Data":"fc54b88dee5bcbe1cdf69eb89acdf0f1cd9dfd4891e9ec7792db67d09e8ba463"} Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.331122 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.339495 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" podStartSLOduration=46.339477912 podStartE2EDuration="46.339477912s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.33861873 +0000 UTC m=+88.516811426" watchObservedRunningTime="2026-01-11 17:31:54.339477912 +0000 UTC m=+88.517670608" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.344069 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:31:54 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:31:54 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:31:54 crc kubenswrapper[4837]: healthz check failed Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.344115 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" 
podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.375189 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wbdfb" podStartSLOduration=9.375175952 podStartE2EDuration="9.375175952s" podCreationTimestamp="2026-01-11 17:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.373461238 +0000 UTC m=+88.551653944" watchObservedRunningTime="2026-01-11 17:31:54.375175952 +0000 UTC m=+88.553368658" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.406454 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.407875 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:54.907852534 +0000 UTC m=+89.086045240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.409736 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pjl6t" podStartSLOduration=46.409717402 podStartE2EDuration="46.409717402s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.408364227 +0000 UTC m=+88.586556933" watchObservedRunningTime="2026-01-11 17:31:54.409717402 +0000 UTC m=+88.587910108" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.443198 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hchsh" podStartSLOduration=46.443179805 podStartE2EDuration="46.443179805s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.441881272 +0000 UTC m=+88.620073978" watchObservedRunningTime="2026-01-11 17:31:54.443179805 +0000 UTC m=+88.621372511" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.461891 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-cscqg" podStartSLOduration=46.461862902 podStartE2EDuration="46.461862902s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.461535412 +0000 UTC m=+88.639728118" watchObservedRunningTime="2026-01-11 17:31:54.461862902 +0000 UTC m=+88.640055608" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.508087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.508563 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.008547931 +0000 UTC m=+89.186740637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.514039 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fxkfr" podStartSLOduration=46.513959139 podStartE2EDuration="46.513959139s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.489394073 +0000 UTC m=+88.667586779" watchObservedRunningTime="2026-01-11 17:31:54.513959139 +0000 UTC m=+88.692151835" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.528533 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" podStartSLOduration=46.52851833 podStartE2EDuration="46.52851833s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.526103868 +0000 UTC m=+88.704296574" watchObservedRunningTime="2026-01-11 17:31:54.52851833 +0000 UTC m=+88.706711036" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.554011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vzfwx" podStartSLOduration=45.553995529 podStartE2EDuration="45.553995529s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.548830838 +0000 UTC m=+88.727023544" watchObservedRunningTime="2026-01-11 17:31:54.553995529 +0000 UTC m=+88.732188235" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.579295 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nqs9r" podStartSLOduration=45.579281723 podStartE2EDuration="45.579281723s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:54.577377666 +0000 UTC m=+88.755570362" watchObservedRunningTime="2026-01-11 17:31:54.579281723 +0000 UTC m=+88.757474429" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.613997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.614187 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.114160553 +0000 UTC m=+89.292353259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.614440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.614885 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.114871901 +0000 UTC m=+89.293064607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.715298 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.715668 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.215643059 +0000 UTC m=+89.393835765 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.817110 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.817166 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.817440 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.317427673 +0000 UTC m=+89.495620379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.823125 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0d76b5a-6ea4-4508-ac4b-0f74711d7f68-metrics-certs\") pod \"network-metrics-daemon-f2l24\" (UID: \"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68\") " pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.877651 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2l24" Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.918427 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.918621 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.418590773 +0000 UTC m=+89.596783479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:54 crc kubenswrapper[4837]: I0111 17:31:54.919958 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:54 crc kubenswrapper[4837]: E0111 17:31:54.920452 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.420440169 +0000 UTC m=+89.598632875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.021333 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.021690 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.521658209 +0000 UTC m=+89.699850905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.122261 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.122602 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.622591911 +0000 UTC m=+89.800784617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.156262 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f2l24"] Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.223324 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.223574 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.723544134 +0000 UTC m=+89.901736850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.287600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.323045 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f2l24" event={"ID":"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68","Type":"ContainerStarted","Data":"b4d73bef8fdc6f835d224c4a810fe32d3d9e0cd57cbdc0c9908759b35fd5b89f"} Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.323559 4837 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-wphsv container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.323607 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" podUID="cb48d08d-1606-49d9-a55d-53b21ad9f404" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.324296 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" 
containerName="kube-multus-additional-cni-plugins" containerID="cri-o://8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" gracePeriod=30 Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.324535 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.325056 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.325532 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.825512224 +0000 UTC m=+90.003704970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.341613 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:31:55 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:31:55 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:31:55 crc kubenswrapper[4837]: healthz check failed Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.341787 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.353365 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" podStartSLOduration=47.353345713 podStartE2EDuration="47.353345713s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:55.351826214 +0000 UTC m=+89.530018940" watchObservedRunningTime="2026-01-11 17:31:55.353345713 +0000 UTC m=+89.531538419" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.398932 4837 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" podStartSLOduration=46.398912795 podStartE2EDuration="46.398912795s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:55.39599515 +0000 UTC m=+89.574187876" watchObservedRunningTime="2026-01-11 17:31:55.398912795 +0000 UTC m=+89.577105501" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.413369 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" podStartSLOduration=46.413353812 podStartE2EDuration="46.413353812s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:55.410856619 +0000 UTC m=+89.589049335" watchObservedRunningTime="2026-01-11 17:31:55.413353812 +0000 UTC m=+89.591546518" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.426282 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bs6rl" podStartSLOduration=47.426265152 podStartE2EDuration="47.426265152s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:55.424097696 +0000 UTC m=+89.602290402" watchObservedRunningTime="2026-01-11 17:31:55.426265152 +0000 UTC m=+89.604457858" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.427856 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.428051 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.928033916 +0000 UTC m=+90.106226622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.428381 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.431099 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:55.931084605 +0000 UTC m=+90.109277321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.443775 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" podStartSLOduration=46.443756127 podStartE2EDuration="46.443756127s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:55.441440528 +0000 UTC m=+89.619633244" watchObservedRunningTime="2026-01-11 17:31:55.443756127 +0000 UTC m=+89.621948833" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.458052 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9dr4c" podStartSLOduration=47.458034851 podStartE2EDuration="47.458034851s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:55.457295903 +0000 UTC m=+89.635488609" watchObservedRunningTime="2026-01-11 17:31:55.458034851 +0000 UTC m=+89.636227557" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.473923 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-n8rwr" podStartSLOduration=47.473900816 podStartE2EDuration="47.473900816s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:55.472245743 +0000 UTC m=+89.650438439" watchObservedRunningTime="2026-01-11 17:31:55.473900816 +0000 UTC m=+89.652093522" Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.530418 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.530755 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.030737164 +0000 UTC m=+90.208929860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.634262 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.634873 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.134857768 +0000 UTC m=+90.313050474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.735761 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.736009 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.235976875 +0000 UTC m=+90.414169581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.736090 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.736455 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.236441088 +0000 UTC m=+90.414633794 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.837483 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.837786 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.33775857 +0000 UTC m=+90.515951286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.838011 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.838341 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.338328604 +0000 UTC m=+90.516521310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:55 crc kubenswrapper[4837]: I0111 17:31:55.938598 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:55 crc kubenswrapper[4837]: E0111 17:31:55.939990 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.439971335 +0000 UTC m=+90.618164041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.040580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.040964 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.540952829 +0000 UTC m=+90.719145525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.141137 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.141545 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.641504182 +0000 UTC m=+90.819696898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.242491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.242931 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.742911396 +0000 UTC m=+90.921104162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.323493 4837 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ldzgv container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.323784 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" podUID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.327599 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" event={"ID":"41ef8783-9b8b-427d-b3db-56d90bb448fa","Type":"ContainerStarted","Data":"21e5ae2ccc5d6566b2321a0c51cb8b226a58160d7922b512bb7efdca67b2d72f"} Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.329449 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" event={"ID":"b61e27df-5c38-48b3-b6e9-bca3ce8aa429","Type":"ContainerStarted","Data":"cfdf89ee3291fa26d37be756c02921154805ca0cbf468dae769835a256db62bd"} Jan 11 17:31:56 
crc kubenswrapper[4837]: I0111 17:31:56.331002 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dddb7" event={"ID":"1bb35332-d7fa-4163-99c1-3de2e12a6165","Type":"ContainerStarted","Data":"ac938961119134566dacdf9e7c9d847b68647a0c58c7305f98671a709d441922"} Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.332396 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" event={"ID":"f15ef5f6-e972-4986-a15c-da1e90b74ae0","Type":"ContainerStarted","Data":"26acb6b9abfd9f3fb14dc860fdae4d7021ec1890842d6d148e56f1738bac9ecb"} Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.333575 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" event={"ID":"8c7124f1-e7cc-4ae8-89d2-19457c045576","Type":"ContainerStarted","Data":"a8075c7d1d2d12248d7888fcec22f657c701c0328c0765e9b3e7116b27b22db6"} Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.334274 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:31:56 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:31:56 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:31:56 crc kubenswrapper[4837]: healthz check failed Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.334309 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.334763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" 
event={"ID":"43082497-a570-48fc-95f8-eda27581cde7","Type":"ContainerStarted","Data":"fdd34e6f7cf07531a37a6a51204799f2e0d9066c43dacf9253d1da8eb31608d1"} Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.336141 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" event={"ID":"8299e03d-1f93-4032-bcad-2ce040734c86","Type":"ContainerStarted","Data":"caac814653040fcbc4c54667f0de5a8b6edb42bda32d083d4078e40e9f48afb4"} Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.337964 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn" event={"ID":"09ba226e-a3c0-402a-a7d0-bbc4f46d5dd3","Type":"ContainerStarted","Data":"6229aa788ec4eab756ed3c43d0b65b74f13de9357a0ec5ec7f930b42c1a95abd"} Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.343563 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.343703 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.843662314 +0000 UTC m=+91.021855030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.343873 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.344168 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.844157367 +0000 UTC m=+91.022350073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.444621 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.444787 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.944757751 +0000 UTC m=+91.122950457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.444890 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.445236 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:56.945219633 +0000 UTC m=+91.123412339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.481793 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.521507 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.546166 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.546566 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.046550106 +0000 UTC m=+91.224742812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.647466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.647845 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.147834847 +0000 UTC m=+91.326027543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.748800 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.748975 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.248950724 +0000 UTC m=+91.427143430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.749092 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.749422 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.249414186 +0000 UTC m=+91.427606892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.850387 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.850574 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.350550074 +0000 UTC m=+91.528742780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.850884 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.851146 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.351135968 +0000 UTC m=+91.529328674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.952056 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.952277 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.452244626 +0000 UTC m=+91.630437342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:56 crc kubenswrapper[4837]: I0111 17:31:56.952426 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:56 crc kubenswrapper[4837]: E0111 17:31:56.952789 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.45277909 +0000 UTC m=+91.630971796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.053548 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.053865 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.553849616 +0000 UTC m=+91.732042322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.155006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.155345 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.655326282 +0000 UTC m=+91.833518998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.256031 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.256230 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.756199583 +0000 UTC m=+91.934392299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.256650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.256988 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.756978303 +0000 UTC m=+91.935171099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.334708 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:31:57 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:31:57 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:31:57 crc kubenswrapper[4837]: healthz check failed Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.334776 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.343540 4837 generic.go:334] "Generic (PLEG): container finished" podID="90317184-ec5a-4dc2-a9f8-6075c0d78aa1" containerID="68d6e80bddec6bf073fe45ce7b2b3e74314236ca041b7b15504ef166b0de6576" exitCode=0 Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.343592 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" event={"ID":"90317184-ec5a-4dc2-a9f8-6075c0d78aa1","Type":"ContainerDied","Data":"68d6e80bddec6bf073fe45ce7b2b3e74314236ca041b7b15504ef166b0de6576"} Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.345587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-apiserver/apiserver-76f77b778f-pz586" event={"ID":"09971765-a82b-4ab2-b79a-5defb8feb416","Type":"ContainerStarted","Data":"91ad29f7e847a5a5d3920b52953cd51a7094ace947a21168aedfef043232fd51"} Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.345610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pz586" event={"ID":"09971765-a82b-4ab2-b79a-5defb8feb416","Type":"ContainerStarted","Data":"123974ee0513d46200ef70d75e5f7fd2edcd665b9acc54684477d030f6761096"} Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.346794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" event={"ID":"876b267e-baf3-4f7d-a4a3-49f44b4dfbb7","Type":"ContainerStarted","Data":"73435d360942019a1cc53498209da210036856c9a689139831137e9c4014e875"} Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.348560 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f2l24" event={"ID":"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68","Type":"ContainerStarted","Data":"306a4a6c5268c9d07caa740045edcd4143434e552679a49da09c04df5ee3abf8"} Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.349355 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dddb7" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.356955 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.357169 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.857145986 +0000 UTC m=+92.035338692 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.357219 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.357594 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.857580547 +0000 UTC m=+92.035773253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.375501 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk9j9" podStartSLOduration=48.375483434 podStartE2EDuration="48.375483434s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.370587779 +0000 UTC m=+91.548780485" watchObservedRunningTime="2026-01-11 17:31:57.375483434 +0000 UTC m=+91.553676140" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.396583 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p7jgn" podStartSLOduration=49.396569331 podStartE2EDuration="49.396569331s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.394425136 +0000 UTC m=+91.572617842" watchObservedRunningTime="2026-01-11 17:31:57.396569331 +0000 UTC m=+91.574762027" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.413990 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5ch" podStartSLOduration=49.413967594 podStartE2EDuration="49.413967594s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.412088807 +0000 UTC m=+91.590281513" watchObservedRunningTime="2026-01-11 17:31:57.413967594 +0000 UTC m=+91.592160300" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.433035 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dddb7" podStartSLOduration=11.43301939 podStartE2EDuration="11.43301939s" podCreationTimestamp="2026-01-11 17:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.431750318 +0000 UTC m=+91.609943034" watchObservedRunningTime="2026-01-11 17:31:57.43301939 +0000 UTC m=+91.611212096" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.453195 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ps47j" podStartSLOduration=48.453180103 podStartE2EDuration="48.453180103s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.450598267 +0000 UTC m=+91.628790983" watchObservedRunningTime="2026-01-11 17:31:57.453180103 +0000 UTC m=+91.631372809" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.458079 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.458279 4837 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.958235072 +0000 UTC m=+92.136427798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.459793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.461490 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:57.961477205 +0000 UTC m=+92.139669911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.478037 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gsqgk" podStartSLOduration=49.478020167 podStartE2EDuration="49.478020167s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.474443626 +0000 UTC m=+91.652636332" watchObservedRunningTime="2026-01-11 17:31:57.478020167 +0000 UTC m=+91.656212873" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.504807 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zdg78" podStartSLOduration=48.504789749 podStartE2EDuration="48.504789749s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.502575133 +0000 UTC m=+91.680767839" watchObservedRunningTime="2026-01-11 17:31:57.504789749 +0000 UTC m=+91.682982455" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.521012 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2984c" podStartSLOduration=48.520993583 podStartE2EDuration="48.520993583s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.519944835 +0000 UTC m=+91.698137541" watchObservedRunningTime="2026-01-11 17:31:57.520993583 +0000 UTC m=+91.699186289" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.561381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.561546 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:58.061510325 +0000 UTC m=+92.239703041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.561701 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.562001 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-11 17:31:58.061989137 +0000 UTC m=+92.240181843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4nvrr" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.570365 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" podStartSLOduration=48.57034568 podStartE2EDuration="48.57034568s" podCreationTimestamp="2026-01-11 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:57.542971242 +0000 UTC m=+91.721163948" watchObservedRunningTime="2026-01-11 17:31:57.57034568 +0000 UTC m=+91.748538396" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.573540 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxndb"] Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.578279 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.580890 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.603576 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxndb"] Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.662619 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.662846 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8dhm\" (UniqueName: \"kubernetes.io/projected/1d39ff8b-c79a-46ea-af70-0902ce0ee504-kube-api-access-k8dhm\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.662886 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-utilities\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.662916 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-catalog-content\") pod \"community-operators-mxndb\" (UID: 
\"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: E0111 17:31:57.663020 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-11 17:31:58.163006232 +0000 UTC m=+92.341198938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.669334 4837 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.671355 4837 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-11T17:31:57.669405705Z","Handler":null,"Name":""} Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.680521 4837 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.680604 4837 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 11 17:31:57 crc 
kubenswrapper[4837]: I0111 17:31:57.763960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.764289 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8dhm\" (UniqueName: \"kubernetes.io/projected/1d39ff8b-c79a-46ea-af70-0902ce0ee504-kube-api-access-k8dhm\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.764334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-utilities\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.764368 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-catalog-content\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.764798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-catalog-content\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " 
pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.765305 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-utilities\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.766579 4837 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.766610 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.768167 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-29ts4"] Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.769187 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.770973 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.790566 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29ts4"] Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.800593 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8dhm\" (UniqueName: \"kubernetes.io/projected/1d39ff8b-c79a-46ea-af70-0902ce0ee504-kube-api-access-k8dhm\") pod \"community-operators-mxndb\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.827644 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4nvrr\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.865402 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.865697 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-utilities\") pod \"certified-operators-29ts4\" (UID: 
\"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.865752 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-catalog-content\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.865792 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69k8h\" (UniqueName: \"kubernetes.io/projected/30409294-8779-48ad-a6e8-36b662f09c0f-kube-api-access-69k8h\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.938834 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.939355 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.939821 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.952076 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.967084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-utilities\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.967142 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-catalog-content\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.967188 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69k8h\" (UniqueName: \"kubernetes.io/projected/30409294-8779-48ad-a6e8-36b662f09c0f-kube-api-access-69k8h\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.967897 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-utilities\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.967950 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-catalog-content\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:57 crc kubenswrapper[4837]: I0111 17:31:57.991473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69k8h\" (UniqueName: \"kubernetes.io/projected/30409294-8779-48ad-a6e8-36b662f09c0f-kube-api-access-69k8h\") pod \"certified-operators-29ts4\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.009985 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.016751 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.017115 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p7kkt"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.018044 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.031353 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-dmnqc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.031397 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dmnqc" podUID="75ee2960-aa16-4aea-84f2-d60c34d6fb1a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.032129 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-dmnqc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.032151 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dmnqc" podUID="75ee2960-aa16-4aea-84f2-d60c34d6fb1a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.032380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-p7kkt"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.087182 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.145466 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.145496 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.160025 4837 patch_prober.go:28] interesting pod/console-f9d7485db-v8df8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.160074 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v8df8" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.170205 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jk4fx"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.172496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwxh\" (UniqueName: \"kubernetes.io/projected/df3f0f97-5906-44b5-99d5-6003e1b23be1-kube-api-access-rcwxh\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.172581 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-utilities\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.172636 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-catalog-content\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.184555 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jk4fx"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.185484 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.219725 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxndb"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.254354 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.273810 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.275723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcwxh\" (UniqueName: \"kubernetes.io/projected/df3f0f97-5906-44b5-99d5-6003e1b23be1-kube-api-access-rcwxh\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.275780 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-catalog-content\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.275802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-utilities\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.275820 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-utilities\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.275883 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-catalog-content\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.275917 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7r2n\" (UniqueName: \"kubernetes.io/projected/1aecbb0e-6cc3-4308-a741-c1799ec8b541-kube-api-access-p7r2n\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.277351 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-utilities\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.278012 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-catalog-content\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.312733 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rcwxh\" (UniqueName: \"kubernetes.io/projected/df3f0f97-5906-44b5-99d5-6003e1b23be1-kube-api-access-rcwxh\") pod \"community-operators-p7kkt\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.337072 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.338127 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:31:58 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:31:58 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:31:58 crc kubenswrapper[4837]: healthz check failed Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.338152 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.348922 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.376653 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.377582 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" Jan 11 17:31:58 
crc kubenswrapper[4837]: I0111 17:31:58.377601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f2l24" event={"ID":"b0d76b5a-6ea4-4508-ac4b-0f74711d7f68","Type":"ContainerStarted","Data":"56942cccc42817e47f2fde552e3b60ffddffa77aba7ed7fbfdc8693b796ae4c2"} Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.377616 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxndb" event={"ID":"1d39ff8b-c79a-46ea-af70-0902ce0ee504","Type":"ContainerStarted","Data":"91a0daa167fdd699229a9451fc55940853efe015d3de3667f9508a5aaddca406"} Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.377744 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-utilities\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.377904 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7r2n\" (UniqueName: \"kubernetes.io/projected/1aecbb0e-6cc3-4308-a741-c1799ec8b541-kube-api-access-p7r2n\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.378046 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-catalog-content\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.378134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-utilities\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.378522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-catalog-content\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.380005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" event={"ID":"ce38ff9a-9354-463c-a8b5-3d4bbea8694d","Type":"ContainerStarted","Data":"825662e221a7ec6ccb3b50424125f25a56329bbd2f3604f844d0496e8e196706"} Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.406757 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2ngzf" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.413505 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7r2n\" (UniqueName: \"kubernetes.io/projected/1aecbb0e-6cc3-4308-a741-c1799ec8b541-kube-api-access-p7r2n\") pod \"certified-operators-jk4fx\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.425588 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4nvrr"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.438905 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f2l24" podStartSLOduration=50.438882897 podStartE2EDuration="50.438882897s" podCreationTimestamp="2026-01-11 
17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:58.423504286 +0000 UTC m=+92.601696992" watchObservedRunningTime="2026-01-11 17:31:58.438882897 +0000 UTC m=+92.617075603" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.442579 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.442612 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.469064 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29ts4"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.482913 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pz586" podStartSLOduration=50.482895949 podStartE2EDuration="50.482895949s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:58.481735119 +0000 UTC m=+92.659927815" watchObservedRunningTime="2026-01-11 17:31:58.482895949 +0000 UTC m=+92.661088645" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.525088 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.726307 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.747776 4837 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pz586 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]log ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]etcd ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/generic-apiserver-start-informers ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/max-in-flight-filter ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 11 17:31:58 crc kubenswrapper[4837]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 11 17:31:58 crc kubenswrapper[4837]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/project.openshift.io-projectcache ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/openshift.io-startinformers ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 11 17:31:58 crc kubenswrapper[4837]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 11 17:31:58 crc kubenswrapper[4837]: livez check failed Jan 11 17:31:58 crc kubenswrapper[4837]: 
I0111 17:31:58.747825 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pz586" podUID="09971765-a82b-4ab2-b79a-5defb8feb416" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.757627 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5xllw" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.781157 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.892198 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p7kkt"] Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.903251 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-secret-volume\") pod \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.903320 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7rjn\" (UniqueName: \"kubernetes.io/projected/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-kube-api-access-l7rjn\") pod \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.903350 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-config-volume\") pod \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\" (UID: \"90317184-ec5a-4dc2-a9f8-6075c0d78aa1\") " Jan 11 17:31:58 crc kubenswrapper[4837]: 
I0111 17:31:58.904363 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-config-volume" (OuterVolumeSpecName: "config-volume") pod "90317184-ec5a-4dc2-a9f8-6075c0d78aa1" (UID: "90317184-ec5a-4dc2-a9f8-6075c0d78aa1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.931507 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "90317184-ec5a-4dc2-a9f8-6075c0d78aa1" (UID: "90317184-ec5a-4dc2-a9f8-6075c0d78aa1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.937890 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-kube-api-access-l7rjn" (OuterVolumeSpecName: "kube-api-access-l7rjn") pod "90317184-ec5a-4dc2-a9f8-6075c0d78aa1" (UID: "90317184-ec5a-4dc2-a9f8-6075c0d78aa1"). InnerVolumeSpecName "kube-api-access-l7rjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:31:58 crc kubenswrapper[4837]: I0111 17:31:58.999133 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.004643 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.004684 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7rjn\" (UniqueName: \"kubernetes.io/projected/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-kube-api-access-l7rjn\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.004695 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90317184-ec5a-4dc2-a9f8-6075c0d78aa1-config-volume\") on node \"crc\" DevicePath \"\"" Jan 11 17:31:59 crc kubenswrapper[4837]: E0111 17:31:59.042941 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:31:59 crc kubenswrapper[4837]: E0111 17:31:59.050119 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.053604 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-jk4fx"] Jan 11 17:31:59 crc kubenswrapper[4837]: E0111 17:31:59.066972 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:31:59 crc kubenswrapper[4837]: E0111 17:31:59.067046 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.331305 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.335489 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:31:59 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:31:59 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:31:59 crc kubenswrapper[4837]: healthz check failed Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.335567 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.338929 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.339004 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.344841 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-wphsv" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.345032 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sxnbq" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.347115 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.388755 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" event={"ID":"bb63c9f5-457d-4c61-8cc6-56690e66a952","Type":"ContainerStarted","Data":"6bc2f1ac34164e766e353d67f51950114a79249b4c0b2598ba338ed5bc4949b1"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.388805 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" event={"ID":"bb63c9f5-457d-4c61-8cc6-56690e66a952","Type":"ContainerStarted","Data":"e5206d4c4817d15db4f3a3b1c39591e603914dd0b75a0186d4159d621201c5aa"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.389480 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.401197 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" 
event={"ID":"ce38ff9a-9354-463c-a8b5-3d4bbea8694d","Type":"ContainerStarted","Data":"f2cb2a8b42dd50b331b9df5fdf70bdaacdae1e7357187a1514a1907f3ecf22ba"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.401253 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" event={"ID":"ce38ff9a-9354-463c-a8b5-3d4bbea8694d","Type":"ContainerStarted","Data":"9ce398bb8fa1a343cd4982275df5ccf0dff39066f9c4c638864509623da17331"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.405096 4837 generic.go:334] "Generic (PLEG): container finished" podID="30409294-8779-48ad-a6e8-36b662f09c0f" containerID="79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9" exitCode=0 Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.405172 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29ts4" event={"ID":"30409294-8779-48ad-a6e8-36b662f09c0f","Type":"ContainerDied","Data":"79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.405196 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29ts4" event={"ID":"30409294-8779-48ad-a6e8-36b662f09c0f","Type":"ContainerStarted","Data":"5b85927c5fcc0dec12428ea7bf9832e3dd16d17005fa34a24b37d1ca1c528396"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.406839 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.407827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" event={"ID":"90317184-ec5a-4dc2-a9f8-6075c0d78aa1","Type":"ContainerDied","Data":"d3dd044288f2ef676d6e67142fad923ef85f583071149f64a963d044c59dfa32"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.407855 4837 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="d3dd044288f2ef676d6e67142fad923ef85f583071149f64a963d044c59dfa32" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.407911 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.415279 4837 generic.go:334] "Generic (PLEG): container finished" podID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerID="d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6" exitCode=0 Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.415409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxndb" event={"ID":"1d39ff8b-c79a-46ea-af70-0902ce0ee504","Type":"ContainerDied","Data":"d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.421353 4837 generic.go:334] "Generic (PLEG): container finished" podID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerID="1a9a71953ddde761cb532dd9ef63d52ebae6bfcd0fa0ba084bc27f74267b34df" exitCode=0 Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.421652 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jk4fx" event={"ID":"1aecbb0e-6cc3-4308-a741-c1799ec8b541","Type":"ContainerDied","Data":"1a9a71953ddde761cb532dd9ef63d52ebae6bfcd0fa0ba084bc27f74267b34df"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.421712 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jk4fx" event={"ID":"1aecbb0e-6cc3-4308-a741-c1799ec8b541","Type":"ContainerStarted","Data":"c70c75a47202667ce92075c9c7176bd592470daa0f9f64d3910db538c8ab857e"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.432383 4837 generic.go:334] "Generic (PLEG): container finished" podID="df3f0f97-5906-44b5-99d5-6003e1b23be1" 
containerID="5823a4a921468ed5d06654a32fe52cdbe4ec67f93bb04e760f043ead4a7d2ecc" exitCode=0 Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.433126 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7kkt" event={"ID":"df3f0f97-5906-44b5-99d5-6003e1b23be1","Type":"ContainerDied","Data":"5823a4a921468ed5d06654a32fe52cdbe4ec67f93bb04e760f043ead4a7d2ecc"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.433156 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7kkt" event={"ID":"df3f0f97-5906-44b5-99d5-6003e1b23be1","Type":"ContainerStarted","Data":"797c061fb7e056e4028df3e86c22811a02194612125e61d890da7fc65f0a6eb6"} Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.542732 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" podStartSLOduration=51.542711962 podStartE2EDuration="51.542711962s" podCreationTimestamp="2026-01-11 17:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:59.539862889 +0000 UTC m=+93.718055605" watchObservedRunningTime="2026-01-11 17:31:59.542711962 +0000 UTC m=+93.720904668" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.557815 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tdxwr" podStartSLOduration=13.557794527 podStartE2EDuration="13.557794527s" podCreationTimestamp="2026-01-11 17:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:31:59.557294333 +0000 UTC m=+93.735487039" watchObservedRunningTime="2026-01-11 17:31:59.557794527 +0000 UTC m=+93.735987233" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.566568 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-9blrl"] Jan 11 17:31:59 crc kubenswrapper[4837]: E0111 17:31:59.566775 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90317184-ec5a-4dc2-a9f8-6075c0d78aa1" containerName="collect-profiles" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.566787 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="90317184-ec5a-4dc2-a9f8-6075c0d78aa1" containerName="collect-profiles" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.566895 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="90317184-ec5a-4dc2-a9f8-6075c0d78aa1" containerName="collect-profiles" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.567572 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.568967 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.586352 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9blrl"] Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.720447 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-catalog-content\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.720579 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-utilities\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " 
pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.720615 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qkcm\" (UniqueName: \"kubernetes.io/projected/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-kube-api-access-7qkcm\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.822365 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-utilities\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.822726 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qkcm\" (UniqueName: \"kubernetes.io/projected/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-kube-api-access-7qkcm\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.822793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-catalog-content\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.823584 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-utilities\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " 
pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.823633 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-catalog-content\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.840658 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qkcm\" (UniqueName: \"kubernetes.io/projected/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-kube-api-access-7qkcm\") pod \"redhat-marketplace-9blrl\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.880063 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.970624 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dht47"] Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.972654 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:31:59 crc kubenswrapper[4837]: I0111 17:31:59.984519 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dht47"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.082228 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9blrl"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.126354 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-catalog-content\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.126454 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-utilities\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.126502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cmwk\" (UniqueName: \"kubernetes.io/projected/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-kube-api-access-9cmwk\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.209957 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.210945 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.213882 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.214346 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.216331 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.227546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-catalog-content\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.227627 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-utilities\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.227719 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cmwk\" (UniqueName: \"kubernetes.io/projected/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-kube-api-access-9cmwk\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.228336 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-utilities\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.228615 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-catalog-content\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.254585 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cmwk\" (UniqueName: \"kubernetes.io/projected/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-kube-api-access-9cmwk\") pod \"redhat-marketplace-dht47\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.293033 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.329721 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.329925 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.335251 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:00 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:00 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:00 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.335334 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.431210 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kube-api-access\") 
pod \"revision-pruner-9-crc\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.431867 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.431968 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.452610 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.453352 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.457036 4837 generic.go:334] "Generic (PLEG): container finished" podID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerID="4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080" exitCode=0 Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.459518 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9blrl" event={"ID":"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821","Type":"ContainerDied","Data":"4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080"} Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.459547 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9blrl" event={"ID":"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821","Type":"ContainerStarted","Data":"4e78beaaa18406072bf90fd6ff26fa4da2689d59e9678b66ef1563919ebf2ad2"} Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.460648 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.461962 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.464487 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.465962 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.518751 4837 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dht47"] Jan 11 17:32:00 crc kubenswrapper[4837]: W0111 17:32:00.527958 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf0823a_63d9_43c9_9cc7_1e2f364b7855.slice/crio-8b2ffa20981fa479ca0703e74ac8b709cb484f2c0f08727794301b33699c2c82 WatchSource:0}: Error finding container 8b2ffa20981fa479ca0703e74ac8b709cb484f2c0f08727794301b33699c2c82: Status 404 returned error can't find the container with id 8b2ffa20981fa479ca0703e74ac8b709cb484f2c0f08727794301b33699c2c82 Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.533199 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.533273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.537846 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.635331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.635429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.635428 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.655459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.757017 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.770056 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-slftv"] Jan 11 17:32:00 crc 
kubenswrapper[4837]: I0111 17:32:00.772420 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.774922 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.777207 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-slftv"] Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.801368 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.839433 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-catalog-content\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.844750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md5ls\" (UniqueName: \"kubernetes.io/projected/59797bd6-cb69-412d-952b-1673312648e2-kube-api-access-md5ls\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.844951 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-utilities\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 
17:32:00.945932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-utilities\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.945976 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-catalog-content\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.945999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md5ls\" (UniqueName: \"kubernetes.io/projected/59797bd6-cb69-412d-952b-1673312648e2-kube-api-access-md5ls\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.946625 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-utilities\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.946644 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-catalog-content\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:00 crc kubenswrapper[4837]: I0111 17:32:00.970881 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-md5ls\" (UniqueName: \"kubernetes.io/projected/59797bd6-cb69-412d-952b-1673312648e2-kube-api-access-md5ls\") pod \"redhat-operators-slftv\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.004653 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 11 17:32:01 crc kubenswrapper[4837]: W0111 17:32:01.034981 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3526a6e7_fcdc_4304_adea_bdf2a5fbe1f2.slice/crio-0089f5b92fb784897508644043b5648109a56ace91647ba5c92df0f05a096a17 WatchSource:0}: Error finding container 0089f5b92fb784897508644043b5648109a56ace91647ba5c92df0f05a096a17: Status 404 returned error can't find the container with id 0089f5b92fb784897508644043b5648109a56ace91647ba5c92df0f05a096a17 Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.101487 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.169004 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pvd2s"] Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.170520 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.173178 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvd2s"] Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.249293 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-utilities\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.249350 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-catalog-content\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.249375 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfqz\" (UniqueName: \"kubernetes.io/projected/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-kube-api-access-hjfqz\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.335611 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:01 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:01 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:01 crc kubenswrapper[4837]: healthz check failed Jan 11 
17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.335663 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.349763 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-catalog-content\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.349795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfqz\" (UniqueName: \"kubernetes.io/projected/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-kube-api-access-hjfqz\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.349875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-utilities\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.350218 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-utilities\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.351002 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-slftv"] Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.351196 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-catalog-content\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.366909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfqz\" (UniqueName: \"kubernetes.io/projected/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-kube-api-access-hjfqz\") pod \"redhat-operators-pvd2s\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: W0111 17:32:01.379372 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59797bd6_cb69_412d_952b_1673312648e2.slice/crio-436761847f8f2a984da0011d27620401bb5241e31549e4dcb6885900fecaf7ec WatchSource:0}: Error finding container 436761847f8f2a984da0011d27620401bb5241e31549e4dcb6885900fecaf7ec: Status 404 returned error can't find the container with id 436761847f8f2a984da0011d27620401bb5241e31549e4dcb6885900fecaf7ec Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.465012 4837 generic.go:334] "Generic (PLEG): container finished" podID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerID="8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b" exitCode=0 Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.465101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dht47" event={"ID":"2bf0823a-63d9-43c9-9cc7-1e2f364b7855","Type":"ContainerDied","Data":"8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b"} Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 
17:32:01.465446 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dht47" event={"ID":"2bf0823a-63d9-43c9-9cc7-1e2f364b7855","Type":"ContainerStarted","Data":"8b2ffa20981fa479ca0703e74ac8b709cb484f2c0f08727794301b33699c2c82"} Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.468320 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2","Type":"ContainerStarted","Data":"0089f5b92fb784897508644043b5648109a56ace91647ba5c92df0f05a096a17"} Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.470841 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerStarted","Data":"436761847f8f2a984da0011d27620401bb5241e31549e4dcb6885900fecaf7ec"} Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.472583 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93","Type":"ContainerStarted","Data":"c43417bcc294bf4c1d06c628932fd5329f66c140cafdae7a45b21a27839f90c0"} Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.561385 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:32:01 crc kubenswrapper[4837]: I0111 17:32:01.719153 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvd2s"] Jan 11 17:32:02 crc kubenswrapper[4837]: I0111 17:32:02.333835 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:02 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:02 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:02 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:02 crc kubenswrapper[4837]: I0111 17:32:02.334131 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:02 crc kubenswrapper[4837]: I0111 17:32:02.477523 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvd2s" event={"ID":"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c","Type":"ContainerStarted","Data":"d65af25391a3909726d5c7948186895e49ad733f8515b03dda341949131b0b79"} Jan 11 17:32:03 crc kubenswrapper[4837]: I0111 17:32:03.333127 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:03 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:03 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:03 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:03 crc kubenswrapper[4837]: I0111 17:32:03.333204 4837 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:03 crc kubenswrapper[4837]: I0111 17:32:03.445167 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:32:03 crc kubenswrapper[4837]: I0111 17:32:03.452002 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pz586" Jan 11 17:32:03 crc kubenswrapper[4837]: I0111 17:32:03.512242 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerStarted","Data":"3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309"} Jan 11 17:32:03 crc kubenswrapper[4837]: I0111 17:32:03.518017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2","Type":"ContainerStarted","Data":"aecfb7b6d89a18402d48efa8fa20933ba92299109095c72800e4dda4811f0468"} Jan 11 17:32:04 crc kubenswrapper[4837]: I0111 17:32:04.043880 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dddb7" Jan 11 17:32:04 crc kubenswrapper[4837]: I0111 17:32:04.332880 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:04 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:04 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:04 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:04 crc kubenswrapper[4837]: I0111 
17:32:04.332949 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:04 crc kubenswrapper[4837]: I0111 17:32:04.525476 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93","Type":"ContainerStarted","Data":"ca4e4a4c10788547c9aee58880b4688d2b244cf47051e796f624ccd8bea77a8b"} Jan 11 17:32:04 crc kubenswrapper[4837]: I0111 17:32:04.529130 4837 generic.go:334] "Generic (PLEG): container finished" podID="59797bd6-cb69-412d-952b-1673312648e2" containerID="3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309" exitCode=0 Jan 11 17:32:04 crc kubenswrapper[4837]: I0111 17:32:04.529156 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerDied","Data":"3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309"} Jan 11 17:32:05 crc kubenswrapper[4837]: I0111 17:32:05.337806 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:05 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:05 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:05 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:05 crc kubenswrapper[4837]: I0111 17:32:05.337862 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Jan 11 17:32:05 crc kubenswrapper[4837]: I0111 17:32:05.549726 4837 generic.go:334] "Generic (PLEG): container finished" podID="3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2" containerID="aecfb7b6d89a18402d48efa8fa20933ba92299109095c72800e4dda4811f0468" exitCode=0 Jan 11 17:32:05 crc kubenswrapper[4837]: I0111 17:32:05.550466 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2","Type":"ContainerDied","Data":"aecfb7b6d89a18402d48efa8fa20933ba92299109095c72800e4dda4811f0468"} Jan 11 17:32:05 crc kubenswrapper[4837]: I0111 17:32:05.556783 4837 generic.go:334] "Generic (PLEG): container finished" podID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerID="e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3" exitCode=0 Jan 11 17:32:05 crc kubenswrapper[4837]: I0111 17:32:05.558085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvd2s" event={"ID":"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c","Type":"ContainerDied","Data":"e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3"} Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.333480 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:06 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:06 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:06 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.333534 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 
17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.565499 4837 generic.go:334] "Generic (PLEG): container finished" podID="e2c2ceba-826a-4cbe-beb4-a6f43c55cd93" containerID="ca4e4a4c10788547c9aee58880b4688d2b244cf47051e796f624ccd8bea77a8b" exitCode=0 Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.565787 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93","Type":"ContainerDied","Data":"ca4e4a4c10788547c9aee58880b4688d2b244cf47051e796f624ccd8bea77a8b"} Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.854272 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.957364 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kubelet-dir\") pod \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.957460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kube-api-access\") pod \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\" (UID: \"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2\") " Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.957491 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2" (UID: "3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.957860 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:06 crc kubenswrapper[4837]: I0111 17:32:06.962742 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2" (UID: "3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:32:07 crc kubenswrapper[4837]: I0111 17:32:07.059910 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:07 crc kubenswrapper[4837]: I0111 17:32:07.333990 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:07 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:07 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:07 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:07 crc kubenswrapper[4837]: I0111 17:32:07.334210 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:07 crc kubenswrapper[4837]: I0111 17:32:07.573716 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2","Type":"ContainerDied","Data":"0089f5b92fb784897508644043b5648109a56ace91647ba5c92df0f05a096a17"} Jan 11 17:32:07 crc kubenswrapper[4837]: I0111 17:32:07.573759 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0089f5b92fb784897508644043b5648109a56ace91647ba5c92df0f05a096a17" Jan 11 17:32:07 crc kubenswrapper[4837]: I0111 17:32:07.573822 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 11 17:32:08 crc kubenswrapper[4837]: I0111 17:32:08.048202 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dmnqc" Jan 11 17:32:08 crc kubenswrapper[4837]: I0111 17:32:08.142773 4837 patch_prober.go:28] interesting pod/console-f9d7485db-v8df8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 11 17:32:08 crc kubenswrapper[4837]: I0111 17:32:08.143094 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v8df8" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 11 17:32:08 crc kubenswrapper[4837]: I0111 17:32:08.336556 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:08 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:08 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:08 crc kubenswrapper[4837]: healthz check 
failed Jan 11 17:32:08 crc kubenswrapper[4837]: I0111 17:32:08.336624 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:09 crc kubenswrapper[4837]: E0111 17:32:09.052488 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:09 crc kubenswrapper[4837]: E0111 17:32:09.055486 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:09 crc kubenswrapper[4837]: E0111 17:32:09.058733 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:09 crc kubenswrapper[4837]: E0111 17:32:09.058783 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:32:09 crc kubenswrapper[4837]: I0111 17:32:09.332958 4837 patch_prober.go:28] interesting 
pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:09 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:09 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:09 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:09 crc kubenswrapper[4837]: I0111 17:32:09.333025 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:09 crc kubenswrapper[4837]: I0111 17:32:09.490245 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:32:10 crc kubenswrapper[4837]: I0111 17:32:10.334612 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:10 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:10 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:10 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:10 crc kubenswrapper[4837]: I0111 17:32:10.334686 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:11 crc kubenswrapper[4837]: I0111 17:32:11.333083 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:11 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:11 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:11 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:11 crc kubenswrapper[4837]: I0111 17:32:11.333355 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:12 crc kubenswrapper[4837]: I0111 17:32:12.334021 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:12 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:12 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:12 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:12 crc kubenswrapper[4837]: I0111 17:32:12.334109 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.059505 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.059631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.059714 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.059748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.064917 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.064932 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.067169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.173660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.185211 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.211122 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ljlz2"] Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.211533 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" containerID="cri-o://1ba2d3b02fd38acfbe485a7c569d325c8f90c37299ef3ce62f46c78cd54d92e5" gracePeriod=30 Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.228344 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp"] Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.228582 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" containerID="cri-o://768d549fbdcc0e9a96d2e67c13e0d6694f4d56653fbacb74d5f141cf0e1bb3c4" gracePeriod=30 Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.332782 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:13 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:13 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:13 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.332925 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" 
podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.336051 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 11 17:32:13 crc kubenswrapper[4837]: I0111 17:32:13.336783 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 11 17:32:14 crc kubenswrapper[4837]: I0111 17:32:14.333863 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:14 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:14 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:14 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:14 crc kubenswrapper[4837]: I0111 17:32:14.333967 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:14 crc kubenswrapper[4837]: I0111 17:32:14.625354 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerID="1ba2d3b02fd38acfbe485a7c569d325c8f90c37299ef3ce62f46c78cd54d92e5" exitCode=0 Jan 11 17:32:14 crc kubenswrapper[4837]: I0111 17:32:14.625436 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" event={"ID":"f9f7469a-7ddb-4d35-962e-86154f7750c9","Type":"ContainerDied","Data":"1ba2d3b02fd38acfbe485a7c569d325c8f90c37299ef3ce62f46c78cd54d92e5"} Jan 11 
17:32:15 crc kubenswrapper[4837]: I0111 17:32:15.333832 4837 patch_prober.go:28] interesting pod/router-default-5444994796-cscqg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 11 17:32:15 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Jan 11 17:32:15 crc kubenswrapper[4837]: [+]process-running ok Jan 11 17:32:15 crc kubenswrapper[4837]: healthz check failed Jan 11 17:32:15 crc kubenswrapper[4837]: I0111 17:32:15.333927 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cscqg" podUID="e610467b-9c0c-47ac-86c8-2d700aba3e8e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 11 17:32:16 crc kubenswrapper[4837]: I0111 17:32:16.188724 4837 generic.go:334] "Generic (PLEG): container finished" podID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerID="768d549fbdcc0e9a96d2e67c13e0d6694f4d56653fbacb74d5f141cf0e1bb3c4" exitCode=0 Jan 11 17:32:16 crc kubenswrapper[4837]: I0111 17:32:16.188769 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" event={"ID":"76c9940b-89a9-414c-ab2a-c4c1b4519725","Type":"ContainerDied","Data":"768d549fbdcc0e9a96d2e67c13e0d6694f4d56653fbacb74d5f141cf0e1bb3c4"} Jan 11 17:32:16 crc kubenswrapper[4837]: I0111 17:32:16.333926 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:32:16 crc kubenswrapper[4837]: I0111 17:32:16.337157 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-cscqg" Jan 11 17:32:18 crc kubenswrapper[4837]: I0111 17:32:18.027044 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:32:18 crc kubenswrapper[4837]: I0111 17:32:18.143242 4837 patch_prober.go:28] interesting pod/console-f9d7485db-v8df8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 11 17:32:18 crc kubenswrapper[4837]: I0111 17:32:18.143295 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v8df8" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 11 17:32:18 crc kubenswrapper[4837]: I0111 17:32:18.251442 4837 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-cwdvp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 11 17:32:18 crc kubenswrapper[4837]: I0111 17:32:18.251499 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 11 17:32:18 crc kubenswrapper[4837]: I0111 17:32:18.258313 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ljlz2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 11 17:32:18 crc kubenswrapper[4837]: I0111 17:32:18.258405 4837 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 11 17:32:19 crc kubenswrapper[4837]: E0111 17:32:19.044143 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:19 crc kubenswrapper[4837]: E0111 17:32:19.046580 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:19 crc kubenswrapper[4837]: E0111 17:32:19.048751 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:19 crc kubenswrapper[4837]: E0111 17:32:19.048847 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:32:26 crc kubenswrapper[4837]: I0111 17:32:26.418083 4837 fsHandler.go:133] fs: disk usage and inodes count on following 
dirs took 2.73772836s: [/var/lib/containers/storage/overlay/0a0581c68d77f42afed982673fa37e934dd2323e31e7004b1a105cbafb310dab/diff /var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b7lgc_e1452749-ce38-41f8-89dd-4b567f2a3250/ovn-controller/0.log]; will not log again for this container unless duration exceeds 2s Jan 11 17:32:26 crc kubenswrapper[4837]: I0111 17:32:26.456559 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 11 17:32:28 crc kubenswrapper[4837]: I0111 17:32:28.150204 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:32:28 crc kubenswrapper[4837]: I0111 17:32:28.156326 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:32:28 crc kubenswrapper[4837]: I0111 17:32:28.217975 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.217944938 podStartE2EDuration="2.217944938s" podCreationTimestamp="2026-01-11 17:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:32:28.215127456 +0000 UTC m=+122.393320162" watchObservedRunningTime="2026-01-11 17:32:28.217944938 +0000 UTC m=+122.396137684" Jan 11 17:32:28 crc kubenswrapper[4837]: I0111 17:32:28.251401 4837 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-cwdvp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 11 17:32:28 crc kubenswrapper[4837]: I0111 17:32:28.251482 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" 
podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 11 17:32:28 crc kubenswrapper[4837]: I0111 17:32:28.257497 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ljlz2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 11 17:32:28 crc kubenswrapper[4837]: I0111 17:32:28.257562 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 11 17:32:29 crc kubenswrapper[4837]: I0111 17:32:29.007629 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2vscb" Jan 11 17:32:29 crc kubenswrapper[4837]: E0111 17:32:29.042443 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:29 crc kubenswrapper[4837]: E0111 17:32:29.042937 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" 
containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:29 crc kubenswrapper[4837]: E0111 17:32:29.043640 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:29 crc kubenswrapper[4837]: E0111 17:32:29.043693 4837 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:32:30 crc kubenswrapper[4837]: I0111 17:32:30.275106 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-pvfwl_31dcdcdc-a207-4f09-90af-82c452f9a3f0/kube-multus-additional-cni-plugins/0.log" Jan 11 17:32:30 crc kubenswrapper[4837]: I0111 17:32:30.275523 4837 generic.go:334] "Generic (PLEG): container finished" podID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" exitCode=137 Jan 11 17:32:30 crc kubenswrapper[4837]: I0111 17:32:30.275568 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" event={"ID":"31dcdcdc-a207-4f09-90af-82c452f9a3f0","Type":"ContainerDied","Data":"8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a"} Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.449816 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 11 17:32:35 crc kubenswrapper[4837]: E0111 17:32:35.450502 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2" containerName="pruner" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.450522 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2" containerName="pruner" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.451134 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3526a6e7-fcdc-4304-adea-bdf2a5fbe1f2" containerName="pruner" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.451858 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.459423 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.459718 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.463213 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.629479 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7be850c8-f1a1-40c4-9c12-5ff8af233635-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.629561 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7be850c8-f1a1-40c4-9c12-5ff8af233635-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.730600 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7be850c8-f1a1-40c4-9c12-5ff8af233635-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.730714 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7be850c8-f1a1-40c4-9c12-5ff8af233635-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.732847 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7be850c8-f1a1-40c4-9c12-5ff8af233635-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.768882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7be850c8-f1a1-40c4-9c12-5ff8af233635-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:35 crc kubenswrapper[4837]: I0111 17:32:35.781208 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:32:38 crc kubenswrapper[4837]: I0111 17:32:38.251398 4837 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-cwdvp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 11 17:32:38 crc kubenswrapper[4837]: I0111 17:32:38.251953 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 11 17:32:38 crc kubenswrapper[4837]: I0111 17:32:38.258050 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ljlz2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 11 17:32:38 crc kubenswrapper[4837]: I0111 17:32:38.258105 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 11 17:32:39 crc kubenswrapper[4837]: E0111 17:32:39.041905 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" 
containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:39 crc kubenswrapper[4837]: E0111 17:32:39.043125 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:39 crc kubenswrapper[4837]: E0111 17:32:39.043704 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:39 crc kubenswrapper[4837]: E0111 17:32:39.043785 4837 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:32:39 crc kubenswrapper[4837]: I0111 17:32:39.353020 4837 patch_prober.go:28] interesting pod/console-operator-58897d9998-n8ldn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:32:39 crc kubenswrapper[4837]: I0111 17:32:39.354174 4837 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-n8ldn" podUID="58f0dabf-1deb-453a-9ac9-11324df2b806" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 11 17:32:39 crc kubenswrapper[4837]: I0111 17:32:39.840886 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 11 17:32:39 crc kubenswrapper[4837]: I0111 17:32:39.842370 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:39 crc kubenswrapper[4837]: I0111 17:32:39.854152 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.004540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kube-api-access\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.004751 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-var-lock\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.004848 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.105660 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kube-api-access\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.105740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-var-lock\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.105771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.105836 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.105880 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-var-lock\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.130600 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kube-api-access\") pod \"installer-9-crc\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:40 crc kubenswrapper[4837]: I0111 17:32:40.170978 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:32:46 crc kubenswrapper[4837]: I0111 17:32:46.231743 4837 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-599f5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:32:46 crc kubenswrapper[4837]: I0111 17:32:46.232458 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-599f5" podUID="8854497b-9e74-4bd1-b465-847ad61d8779" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:32:48 crc kubenswrapper[4837]: I0111 17:32:48.260802 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ljlz2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 11 17:32:48 crc kubenswrapper[4837]: I0111 17:32:48.260879 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 11 17:32:49 crc kubenswrapper[4837]: E0111 17:32:49.041966 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:49 crc kubenswrapper[4837]: E0111 17:32:49.042617 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:49 crc kubenswrapper[4837]: E0111 17:32:49.043886 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:49 crc kubenswrapper[4837]: E0111 17:32:49.043973 4837 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:32:49 crc kubenswrapper[4837]: I0111 
17:32:49.250732 4837 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-cwdvp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:32:49 crc kubenswrapper[4837]: I0111 17:32:49.251239 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.830948 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.843252 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.918073 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kubelet-dir\") pod \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.918187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e2c2ceba-826a-4cbe-beb4-a6f43c55cd93" (UID: "e2c2ceba-826a-4cbe-beb4-a6f43c55cd93"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.918225 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kube-api-access\") pod \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\" (UID: \"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93\") " Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.918257 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c9940b-89a9-414c-ab2a-c4c1b4519725-serving-cert\") pod \"76c9940b-89a9-414c-ab2a-c4c1b4519725\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.918303 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-client-ca\") pod \"76c9940b-89a9-414c-ab2a-c4c1b4519725\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.918359 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrczq\" (UniqueName: \"kubernetes.io/projected/76c9940b-89a9-414c-ab2a-c4c1b4519725-kube-api-access-zrczq\") pod \"76c9940b-89a9-414c-ab2a-c4c1b4519725\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.918394 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-config\") pod \"76c9940b-89a9-414c-ab2a-c4c1b4519725\" (UID: \"76c9940b-89a9-414c-ab2a-c4c1b4519725\") " Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.919447 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.919440 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-config" (OuterVolumeSpecName: "config") pod "76c9940b-89a9-414c-ab2a-c4c1b4519725" (UID: "76c9940b-89a9-414c-ab2a-c4c1b4519725"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.919482 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-client-ca" (OuterVolumeSpecName: "client-ca") pod "76c9940b-89a9-414c-ab2a-c4c1b4519725" (UID: "76c9940b-89a9-414c-ab2a-c4c1b4519725"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.925324 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c9940b-89a9-414c-ab2a-c4c1b4519725-kube-api-access-zrczq" (OuterVolumeSpecName: "kube-api-access-zrczq") pod "76c9940b-89a9-414c-ab2a-c4c1b4519725" (UID: "76c9940b-89a9-414c-ab2a-c4c1b4519725"). InnerVolumeSpecName "kube-api-access-zrczq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.925372 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c9940b-89a9-414c-ab2a-c4c1b4519725-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76c9940b-89a9-414c-ab2a-c4c1b4519725" (UID: "76c9940b-89a9-414c-ab2a-c4c1b4519725"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:32:53 crc kubenswrapper[4837]: I0111 17:32:53.928913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e2c2ceba-826a-4cbe-beb4-a6f43c55cd93" (UID: "e2c2ceba-826a-4cbe-beb4-a6f43c55cd93"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.020792 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrczq\" (UniqueName: \"kubernetes.io/projected/76c9940b-89a9-414c-ab2a-c4c1b4519725-kube-api-access-zrczq\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.021116 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.021131 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2c2ceba-826a-4cbe-beb4-a6f43c55cd93-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.021143 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c9940b-89a9-414c-ab2a-c4c1b4519725-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.021155 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c9940b-89a9-414c-ab2a-c4c1b4519725-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.411045 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e2c2ceba-826a-4cbe-beb4-a6f43c55cd93","Type":"ContainerDied","Data":"c43417bcc294bf4c1d06c628932fd5329f66c140cafdae7a45b21a27839f90c0"} Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.411076 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.411085 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c43417bcc294bf4c1d06c628932fd5329f66c140cafdae7a45b21a27839f90c0" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.412896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" event={"ID":"76c9940b-89a9-414c-ab2a-c4c1b4519725","Type":"ContainerDied","Data":"8c438c10c32cc1ce028a5826080a60bb19fcdd0ce715bb13c6531fe27e9f26b1"} Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.412945 4837 scope.go:117] "RemoveContainer" containerID="768d549fbdcc0e9a96d2e67c13e0d6694f4d56653fbacb74d5f141cf0e1bb3c4" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.413005 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp" Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.429946 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp"] Jan 11 17:32:54 crc kubenswrapper[4837]: I0111 17:32:54.432978 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cwdvp"] Jan 11 17:32:55 crc kubenswrapper[4837]: E0111 17:32:55.863442 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 11 17:32:55 crc kubenswrapper[4837]: E0111 17:32:55.864077 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-69k8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-29ts4_openshift-marketplace(30409294-8779-48ad-a6e8-36b662f09c0f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 11 17:32:55 crc kubenswrapper[4837]: E0111 17:32:55.865410 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-29ts4" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" Jan 11 17:32:56 crc 
kubenswrapper[4837]: I0111 17:32:56.374387 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" path="/var/lib/kubelet/pods/76c9940b-89a9-414c-ab2a-c4c1b4519725/volumes" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.713077 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj"] Jan 11 17:32:56 crc kubenswrapper[4837]: E0111 17:32:56.714365 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c2ceba-826a-4cbe-beb4-a6f43c55cd93" containerName="pruner" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.714395 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c2ceba-826a-4cbe-beb4-a6f43c55cd93" containerName="pruner" Jan 11 17:32:56 crc kubenswrapper[4837]: E0111 17:32:56.714418 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.714427 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.714618 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c9940b-89a9-414c-ab2a-c4c1b4519725" containerName="route-controller-manager" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.714642 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c2ceba-826a-4cbe-beb4-a6f43c55cd93" containerName="pruner" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.715121 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.717182 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.717777 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.718039 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.718346 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.718586 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.719287 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj"] Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.720314 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.758765 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb3713b-b193-4a5a-a841-50d15d8331c5-serving-cert\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.759101 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-client-ca\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.759138 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gm8l\" (UniqueName: \"kubernetes.io/projected/8fb3713b-b193-4a5a-a841-50d15d8331c5-kube-api-access-5gm8l\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.759180 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-config\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.860364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-config\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.860432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb3713b-b193-4a5a-a841-50d15d8331c5-serving-cert\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" 
(UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.860456 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-client-ca\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.860493 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gm8l\" (UniqueName: \"kubernetes.io/projected/8fb3713b-b193-4a5a-a841-50d15d8331c5-kube-api-access-5gm8l\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.862162 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-client-ca\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.863820 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-config\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.869329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb3713b-b193-4a5a-a841-50d15d8331c5-serving-cert\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:56 crc kubenswrapper[4837]: I0111 17:32:56.878594 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gm8l\" (UniqueName: \"kubernetes.io/projected/8fb3713b-b193-4a5a-a841-50d15d8331c5-kube-api-access-5gm8l\") pod \"route-controller-manager-6f6d57cc69-fsbzj\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:57 crc kubenswrapper[4837]: I0111 17:32:57.037338 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.401653 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-29ts4" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.634170 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.634351 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k8dhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mxndb_openshift-marketplace(1d39ff8b-c79a-46ea-af70-0902ce0ee504): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.636472 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mxndb" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" 
Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.764359 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.764536 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rcwxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-p7kkt_openshift-marketplace(df3f0f97-5906-44b5-99d5-6003e1b23be1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.765626 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-p7kkt" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.995123 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.995268 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7r2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jk4fx_openshift-marketplace(1aecbb0e-6cc3-4308-a741-c1799ec8b541): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 11 17:32:57 crc kubenswrapper[4837]: E0111 17:32:57.996429 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jk4fx" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" Jan 11 17:32:59 crc 
kubenswrapper[4837]: E0111 17:32:59.041688 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:59 crc kubenswrapper[4837]: E0111 17:32:59.042917 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:59 crc kubenswrapper[4837]: E0111 17:32:59.043314 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 11 17:32:59 crc kubenswrapper[4837]: E0111 17:32:59.043399 4837 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:32:59 crc kubenswrapper[4837]: I0111 17:32:59.258358 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ljlz2 container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 11 17:32:59 crc kubenswrapper[4837]: I0111 17:32:59.258430 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:33:01 crc kubenswrapper[4837]: E0111 17:33:01.082384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jk4fx" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" Jan 11 17:33:01 crc kubenswrapper[4837]: E0111 17:33:01.082524 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mxndb" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" Jan 11 17:33:01 crc kubenswrapper[4837]: E0111 17:33:01.082542 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-p7kkt" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" Jan 11 17:33:04 crc kubenswrapper[4837]: E0111 17:33:04.878426 4837 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 11 17:33:04 crc kubenswrapper[4837]: E0111 17:33:04.878978 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjfqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-pvd2s_openshift-marketplace(0f2f437a-d901-4e9b-95c1-9099d8ffaf9c): ErrImagePull: rpc error: code = Canceled desc = copying system 
image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 11 17:33:04 crc kubenswrapper[4837]: E0111 17:33:04.880253 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pvd2s" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.889086 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-pvfwl_31dcdcdc-a207-4f09-90af-82c452f9a3f0/kube-multus-additional-cni-plugins/0.log" Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.889191 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:33:04 crc kubenswrapper[4837]: E0111 17:33:04.891366 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 11 17:33:04 crc kubenswrapper[4837]: E0111 17:33:04.891486 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qkcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9blrl_openshift-marketplace(5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 11 17:33:04 crc kubenswrapper[4837]: E0111 17:33:04.892822 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9blrl" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" Jan 11 17:33:04 crc 
kubenswrapper[4837]: I0111 17:33:04.895992 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.984893 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/31dcdcdc-a207-4f09-90af-82c452f9a3f0-ready\") pod \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.984976 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31dcdcdc-a207-4f09-90af-82c452f9a3f0-cni-sysctl-allowlist\") pod \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.985013 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnsct\" (UniqueName: \"kubernetes.io/projected/31dcdcdc-a207-4f09-90af-82c452f9a3f0-kube-api-access-dnsct\") pod \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.985140 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31dcdcdc-a207-4f09-90af-82c452f9a3f0-tuning-conf-dir\") pod \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\" (UID: \"31dcdcdc-a207-4f09-90af-82c452f9a3f0\") " Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.985428 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31dcdcdc-a207-4f09-90af-82c452f9a3f0-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "31dcdcdc-a207-4f09-90af-82c452f9a3f0" (UID: "31dcdcdc-a207-4f09-90af-82c452f9a3f0"). 
InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.985605 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31dcdcdc-a207-4f09-90af-82c452f9a3f0-ready" (OuterVolumeSpecName: "ready") pod "31dcdcdc-a207-4f09-90af-82c452f9a3f0" (UID: "31dcdcdc-a207-4f09-90af-82c452f9a3f0"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.985904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31dcdcdc-a207-4f09-90af-82c452f9a3f0-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "31dcdcdc-a207-4f09-90af-82c452f9a3f0" (UID: "31dcdcdc-a207-4f09-90af-82c452f9a3f0"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:33:04 crc kubenswrapper[4837]: I0111 17:33:04.992328 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31dcdcdc-a207-4f09-90af-82c452f9a3f0-kube-api-access-dnsct" (OuterVolumeSpecName: "kube-api-access-dnsct") pod "31dcdcdc-a207-4f09-90af-82c452f9a3f0" (UID: "31dcdcdc-a207-4f09-90af-82c452f9a3f0"). InnerVolumeSpecName "kube-api-access-dnsct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.086623 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-proxy-ca-bundles\") pod \"f9f7469a-7ddb-4d35-962e-86154f7750c9\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-config\") pod \"f9f7469a-7ddb-4d35-962e-86154f7750c9\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087135 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f7469a-7ddb-4d35-962e-86154f7750c9-serving-cert\") pod \"f9f7469a-7ddb-4d35-962e-86154f7750c9\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087202 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4vz2\" (UniqueName: \"kubernetes.io/projected/f9f7469a-7ddb-4d35-962e-86154f7750c9-kube-api-access-g4vz2\") pod \"f9f7469a-7ddb-4d35-962e-86154f7750c9\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087272 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-client-ca\") pod \"f9f7469a-7ddb-4d35-962e-86154f7750c9\" (UID: \"f9f7469a-7ddb-4d35-962e-86154f7750c9\") " Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087603 4837 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/31dcdcdc-a207-4f09-90af-82c452f9a3f0-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087620 4837 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/31dcdcdc-a207-4f09-90af-82c452f9a3f0-ready\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087631 4837 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31dcdcdc-a207-4f09-90af-82c452f9a3f0-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.087661 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnsct\" (UniqueName: \"kubernetes.io/projected/31dcdcdc-a207-4f09-90af-82c452f9a3f0-kube-api-access-dnsct\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.088452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f9f7469a-7ddb-4d35-962e-86154f7750c9" (UID: "f9f7469a-7ddb-4d35-962e-86154f7750c9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.088496 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "f9f7469a-7ddb-4d35-962e-86154f7750c9" (UID: "f9f7469a-7ddb-4d35-962e-86154f7750c9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.089342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-config" (OuterVolumeSpecName: "config") pod "f9f7469a-7ddb-4d35-962e-86154f7750c9" (UID: "f9f7469a-7ddb-4d35-962e-86154f7750c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.093353 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f7469a-7ddb-4d35-962e-86154f7750c9-kube-api-access-g4vz2" (OuterVolumeSpecName: "kube-api-access-g4vz2") pod "f9f7469a-7ddb-4d35-962e-86154f7750c9" (UID: "f9f7469a-7ddb-4d35-962e-86154f7750c9"). InnerVolumeSpecName "kube-api-access-g4vz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.102866 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f7469a-7ddb-4d35-962e-86154f7750c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f9f7469a-7ddb-4d35-962e-86154f7750c9" (UID: "f9f7469a-7ddb-4d35-962e-86154f7750c9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.188647 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.188696 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f7469a-7ddb-4d35-962e-86154f7750c9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.188710 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4vz2\" (UniqueName: \"kubernetes.io/projected/f9f7469a-7ddb-4d35-962e-86154f7750c9-kube-api-access-g4vz2\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.188723 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.188735 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9f7469a-7ddb-4d35-962e-86154f7750c9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.477424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" event={"ID":"f9f7469a-7ddb-4d35-962e-86154f7750c9","Type":"ContainerDied","Data":"a69046d634c7d619658399991d46773b3a2bec6a5441e85c91462348e8410ae9"} Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.477511 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ljlz2" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.479142 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-pvfwl_31dcdcdc-a207-4f09-90af-82c452f9a3f0/kube-multus-additional-cni-plugins/0.log" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.479288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" event={"ID":"31dcdcdc-a207-4f09-90af-82c452f9a3f0","Type":"ContainerDied","Data":"cdc30b8e30daa7d568941c549c77456206d6e80f8186ebb36fef18a3c4bbef63"} Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.479314 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pvfwl" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.545411 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pvfwl"] Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.551864 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pvfwl"] Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.554562 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ljlz2"] Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.557056 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ljlz2"] Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.639111 4837 scope.go:117] "RemoveContainer" containerID="1ba2d3b02fd38acfbe485a7c569d325c8f90c37299ef3ce62f46c78cd54d92e5" Jan 11 17:33:05 crc kubenswrapper[4837]: E0111 17:33:05.671420 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9blrl" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" Jan 11 17:33:05 crc kubenswrapper[4837]: W0111 17:33:05.676763 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-4bd4319dd385cfa4defc8e731ea044dc8be1d2d3d03ef405c94c6cd0a071b87e WatchSource:0}: Error finding container 4bd4319dd385cfa4defc8e731ea044dc8be1d2d3d03ef405c94c6cd0a071b87e: Status 404 returned error can't find the container with id 4bd4319dd385cfa4defc8e731ea044dc8be1d2d3d03ef405c94c6cd0a071b87e Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.731722 4837 scope.go:117] "RemoveContainer" containerID="8e11a566745fc0a91a7711eeb5325461c2be37c7f2d3144b9cc34f15e4aa587a" Jan 11 17:33:05 crc kubenswrapper[4837]: I0111 17:33:05.956688 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 11 17:33:05 crc kubenswrapper[4837]: W0111 17:33:05.964884 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7be850c8_f1a1_40c4_9c12_5ff8af233635.slice/crio-3c1c1f048bf077356ea2b5431167fd03a9989df02497c1e3e01bd5f34937cff1 WatchSource:0}: Error finding container 3c1c1f048bf077356ea2b5431167fd03a9989df02497c1e3e01bd5f34937cff1: Status 404 returned error can't find the container with id 3c1c1f048bf077356ea2b5431167fd03a9989df02497c1e3e01bd5f34937cff1 Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.008256 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.034046 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj"] Jan 11 17:33:06 crc kubenswrapper[4837]: W0111 17:33:06.043272 4837 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb3713b_b193_4a5a_a841_50d15d8331c5.slice/crio-233746b4f7bea9a083ed6775c0f2d8422f607fbdd0d6895f689b4a0d4c9c789a WatchSource:0}: Error finding container 233746b4f7bea9a083ed6775c0f2d8422f607fbdd0d6895f689b4a0d4c9c789a: Status 404 returned error can't find the container with id 233746b4f7bea9a083ed6775c0f2d8422f607fbdd0d6895f689b4a0d4c9c789a Jan 11 17:33:06 crc kubenswrapper[4837]: W0111 17:33:06.106421 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-2d2d59fec6d7e9630a89f4adfd36f6f58f5a4d1ae0de676051f82c062b827355 WatchSource:0}: Error finding container 2d2d59fec6d7e9630a89f4adfd36f6f58f5a4d1ae0de676051f82c062b827355: Status 404 returned error can't find the container with id 2d2d59fec6d7e9630a89f4adfd36f6f58f5a4d1ae0de676051f82c062b827355 Jan 11 17:33:06 crc kubenswrapper[4837]: W0111 17:33:06.127004 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cd44e35f931cbe48c2c5ffb295bea20290ceaacb5a866da951b8978ef97514e9 WatchSource:0}: Error finding container cd44e35f931cbe48c2c5ffb295bea20290ceaacb5a866da951b8978ef97514e9: Status 404 returned error can't find the container with id cd44e35f931cbe48c2c5ffb295bea20290ceaacb5a866da951b8978ef97514e9 Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.377751 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" path="/var/lib/kubelet/pods/31dcdcdc-a207-4f09-90af-82c452f9a3f0/volumes" Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.379387 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" 
path="/var/lib/kubelet/pods/f9f7469a-7ddb-4d35-962e-86154f7750c9/volumes" Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.490710 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ebfd1f531353cc67187cfac6e9786f88453e4246ba98fcd0b5f4967d738560c3"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.491345 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4bd4319dd385cfa4defc8e731ea044dc8be1d2d3d03ef405c94c6cd0a071b87e"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.493599 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cd44e35f931cbe48c2c5ffb295bea20290ceaacb5a866da951b8978ef97514e9"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.502059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7be850c8-f1a1-40c4-9c12-5ff8af233635","Type":"ContainerStarted","Data":"3c1c1f048bf077356ea2b5431167fd03a9989df02497c1e3e01bd5f34937cff1"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.503586 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2d2d59fec6d7e9630a89f4adfd36f6f58f5a4d1ae0de676051f82c062b827355"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.512120 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"1ec7b443-a2dc-4c7b-b661-03ce968d0878","Type":"ContainerStarted","Data":"4dd5c6a3d2f214100793b6470f90f7583464cbc988cde026f9384f409830c00f"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.523081 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dht47" event={"ID":"2bf0823a-63d9-43c9-9cc7-1e2f364b7855","Type":"ContainerStarted","Data":"97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.535906 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" event={"ID":"8fb3713b-b193-4a5a-a841-50d15d8331c5","Type":"ContainerStarted","Data":"233746b4f7bea9a083ed6775c0f2d8422f607fbdd0d6895f689b4a0d4c9c789a"} Jan 11 17:33:06 crc kubenswrapper[4837]: I0111 17:33:06.568014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerStarted","Data":"ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76"} Jan 11 17:33:07 crc kubenswrapper[4837]: I0111 17:33:07.596036 4837 generic.go:334] "Generic (PLEG): container finished" podID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerID="97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c" exitCode=0 Jan 11 17:33:07 crc kubenswrapper[4837]: I0111 17:33:07.598075 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dht47" event={"ID":"2bf0823a-63d9-43c9-9cc7-1e2f364b7855","Type":"ContainerDied","Data":"97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c"} Jan 11 17:33:08 crc kubenswrapper[4837]: I0111 17:33:08.603246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fe15b3e19086456f0a6d456bd0a5145d1b8d26db7bf6e7c6ee08b39b6aa05d05"} Jan 11 17:33:08 crc kubenswrapper[4837]: I0111 17:33:08.604527 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7be850c8-f1a1-40c4-9c12-5ff8af233635","Type":"ContainerStarted","Data":"d6b3118227cf64dad8b6c65cb96b5831371f60a28c4f4dc5332367ffa24bb659"} Jan 11 17:33:08 crc kubenswrapper[4837]: I0111 17:33:08.607146 4837 generic.go:334] "Generic (PLEG): container finished" podID="59797bd6-cb69-412d-952b-1673312648e2" containerID="ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76" exitCode=0 Jan 11 17:33:08 crc kubenswrapper[4837]: I0111 17:33:08.607199 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerDied","Data":"ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76"} Jan 11 17:33:08 crc kubenswrapper[4837]: I0111 17:33:08.608847 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1ec7b443-a2dc-4c7b-b661-03ce968d0878","Type":"ContainerStarted","Data":"34d7a443d8101750876ab0203b1b18d7d9680c9044ac64e95d8e47f90e59a144"} Jan 11 17:33:09 crc kubenswrapper[4837]: I0111 17:33:09.615875 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f7b4de9d637fa5820013ab65cd414f7c3bc9a278787c5ba1b55c606277ed5f1a"} Jan 11 17:33:09 crc kubenswrapper[4837]: I0111 17:33:09.618464 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" 
event={"ID":"8fb3713b-b193-4a5a-a841-50d15d8331c5","Type":"ContainerStarted","Data":"b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d"} Jan 11 17:33:09 crc kubenswrapper[4837]: I0111 17:33:09.639895 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=34.639872131 podStartE2EDuration="34.639872131s" podCreationTimestamp="2026-01-11 17:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:33:09.636792238 +0000 UTC m=+163.814984984" watchObservedRunningTime="2026-01-11 17:33:09.639872131 +0000 UTC m=+163.818064877" Jan 11 17:33:10 crc kubenswrapper[4837]: I0111 17:33:10.657899 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=31.657876362 podStartE2EDuration="31.657876362s" podCreationTimestamp="2026-01-11 17:32:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:33:10.656447194 +0000 UTC m=+164.834639940" watchObservedRunningTime="2026-01-11 17:33:10.657876362 +0000 UTC m=+164.836069098" Jan 11 17:33:11 crc kubenswrapper[4837]: I0111 17:33:11.637239 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:33:11 crc kubenswrapper[4837]: I0111 17:33:11.647041 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:33:11 crc kubenswrapper[4837]: I0111 17:33:11.669135 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" podStartSLOduration=38.669108414 
podStartE2EDuration="38.669108414s" podCreationTimestamp="2026-01-11 17:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:33:11.660888395 +0000 UTC m=+165.839081131" watchObservedRunningTime="2026-01-11 17:33:11.669108414 +0000 UTC m=+165.847301120" Jan 11 17:33:12 crc kubenswrapper[4837]: I0111 17:33:12.645079 4837 generic.go:334] "Generic (PLEG): container finished" podID="7be850c8-f1a1-40c4-9c12-5ff8af233635" containerID="d6b3118227cf64dad8b6c65cb96b5831371f60a28c4f4dc5332367ffa24bb659" exitCode=0 Jan 11 17:33:12 crc kubenswrapper[4837]: I0111 17:33:12.645250 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7be850c8-f1a1-40c4-9c12-5ff8af233635","Type":"ContainerDied","Data":"d6b3118227cf64dad8b6c65cb96b5831371f60a28c4f4dc5332367ffa24bb659"} Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.185995 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.735345 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b577cd54c-v775z"] Jan 11 17:33:13 crc kubenswrapper[4837]: E0111 17:33:13.735751 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.735776 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" Jan 11 17:33:13 crc kubenswrapper[4837]: E0111 17:33:13.735813 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 
17:33:13.735826 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.735995 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f7469a-7ddb-4d35-962e-86154f7750c9" containerName="controller-manager" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.736015 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="31dcdcdc-a207-4f09-90af-82c452f9a3f0" containerName="kube-multus-additional-cni-plugins" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.736598 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.740995 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b577cd54c-v775z"] Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.744290 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.744320 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.745157 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.745351 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.745573 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.745756 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.752933 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.919175 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-client-ca\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.919211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-config\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.919238 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kx4l\" (UniqueName: \"kubernetes.io/projected/99d66a69-1de3-43d5-947f-1fc4611c3022-kube-api-access-8kx4l\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.919264 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d66a69-1de3-43d5-947f-1fc4611c3022-serving-cert\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " 
pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:13 crc kubenswrapper[4837]: I0111 17:33:13.919288 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-proxy-ca-bundles\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:14 crc kubenswrapper[4837]: I0111 17:33:14.032756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-client-ca\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:14 crc kubenswrapper[4837]: I0111 17:33:14.032825 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-config\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:14 crc kubenswrapper[4837]: I0111 17:33:14.032872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kx4l\" (UniqueName: \"kubernetes.io/projected/99d66a69-1de3-43d5-947f-1fc4611c3022-kube-api-access-8kx4l\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:14 crc kubenswrapper[4837]: I0111 17:33:14.032916 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/99d66a69-1de3-43d5-947f-1fc4611c3022-serving-cert\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:14 crc kubenswrapper[4837]: I0111 17:33:14.032966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-proxy-ca-bundles\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.709033 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-client-ca\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.709929 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-proxy-ca-bundles\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.711468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-config\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.715378 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d66a69-1de3-43d5-947f-1fc4611c3022-serving-cert\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.716186 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kx4l\" (UniqueName: \"kubernetes.io/projected/99d66a69-1de3-43d5-947f-1fc4611c3022-kube-api-access-8kx4l\") pod \"controller-manager-b577cd54c-v775z\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.785721 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.958883 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7be850c8-f1a1-40c4-9c12-5ff8af233635-kubelet-dir\") pod \"7be850c8-f1a1-40c4-9c12-5ff8af233635\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.959023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7be850c8-f1a1-40c4-9c12-5ff8af233635-kube-api-access\") pod \"7be850c8-f1a1-40c4-9c12-5ff8af233635\" (UID: \"7be850c8-f1a1-40c4-9c12-5ff8af233635\") " Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.959084 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7be850c8-f1a1-40c4-9c12-5ff8af233635-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7be850c8-f1a1-40c4-9c12-5ff8af233635" (UID: "7be850c8-f1a1-40c4-9c12-5ff8af233635"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.959925 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7be850c8-f1a1-40c4-9c12-5ff8af233635-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:15 crc kubenswrapper[4837]: I0111 17:33:15.964210 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be850c8-f1a1-40c4-9c12-5ff8af233635-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7be850c8-f1a1-40c4-9c12-5ff8af233635" (UID: "7be850c8-f1a1-40c4-9c12-5ff8af233635"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:33:16 crc kubenswrapper[4837]: I0111 17:33:16.009381 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:33:16 crc kubenswrapper[4837]: I0111 17:33:16.062595 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7be850c8-f1a1-40c4-9c12-5ff8af233635-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 11 17:33:16 crc kubenswrapper[4837]: I0111 17:33:16.678563 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7be850c8-f1a1-40c4-9c12-5ff8af233635","Type":"ContainerDied","Data":"3c1c1f048bf077356ea2b5431167fd03a9989df02497c1e3e01bd5f34937cff1"} Jan 11 17:33:16 crc kubenswrapper[4837]: I0111 17:33:16.678619 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c1c1f048bf077356ea2b5431167fd03a9989df02497c1e3e01bd5f34937cff1" Jan 11 17:33:16 crc kubenswrapper[4837]: I0111 17:33:16.678814 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 11 17:33:39 crc kubenswrapper[4837]: I0111 17:33:39.444457 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:33:39 crc kubenswrapper[4837]: I0111 17:33:39.445093 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:33:45 crc kubenswrapper[4837]: I0111 17:33:45.049000 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.106540 4837 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.107439 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c" gracePeriod=15 Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.107476 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://998f980d03503669fbff3f3dd9cee724c43b1a41050d8cc67ba424e9823a4220" gracePeriod=15 Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.107631 
4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54" gracePeriod=15 Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.107777 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d" gracePeriod=15 Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.107873 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d" gracePeriod=15 Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.108861 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109216 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109241 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109262 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be850c8-f1a1-40c4-9c12-5ff8af233635" containerName="pruner" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109277 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7be850c8-f1a1-40c4-9c12-5ff8af233635" containerName="pruner" Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109298 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109314 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109333 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109347 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109379 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109395 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109417 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109432 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109452 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109468 4837 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 11 17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.109490 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109504 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109865 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109905 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be850c8-f1a1-40c4-9c12-5ff8af233635" containerName="pruner" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109925 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109954 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.109975 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.110001 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.110023 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 
17:33:46 crc kubenswrapper[4837]: E0111 17:33:46.110252 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.110273 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.110520 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.113160 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.114956 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.119293 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.158075 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.227522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.227744 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.227802 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.227926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.228010 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.228101 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 
17:33:46.228166 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.228235 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.329596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.329681 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.329788 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 
17:33:46.329803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.329837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.329888 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.329933 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.329961 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330013 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330096 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330153 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330200 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330144 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.330098 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:46 crc kubenswrapper[4837]: I0111 17:33:46.450157 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:33:47 crc kubenswrapper[4837]: I0111 17:33:47.619985 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 11 17:33:47 crc kubenswrapper[4837]: I0111 17:33:47.622348 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 11 17:33:47 crc kubenswrapper[4837]: I0111 17:33:47.623896 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d" exitCode=2 Jan 11 17:33:48 crc kubenswrapper[4837]: E0111 17:33:48.984937 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:33:48 crc kubenswrapper[4837]: E0111 17:33:48.985960 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:33:48 crc kubenswrapper[4837]: E0111 17:33:48.986558 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:33:48 crc kubenswrapper[4837]: E0111 17:33:48.987085 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection 
refused" Jan 11 17:33:48 crc kubenswrapper[4837]: E0111 17:33:48.987597 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:33:48 crc kubenswrapper[4837]: I0111 17:33:48.987666 4837 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 11 17:33:48 crc kubenswrapper[4837]: E0111 17:33:48.988196 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms" Jan 11 17:33:49 crc kubenswrapper[4837]: E0111 17:33:49.190047 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms" Jan 11 17:33:49 crc kubenswrapper[4837]: E0111 17:33:49.591546 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Jan 11 17:33:50 crc kubenswrapper[4837]: E0111 17:33:50.393223 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s" Jan 11 17:33:51 crc kubenswrapper[4837]: E0111 17:33:51.996575 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="3.2s" Jan 11 17:33:52 crc kubenswrapper[4837]: I0111 17:33:52.656342 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 11 17:33:52 crc kubenswrapper[4837]: I0111 17:33:52.657900 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 11 17:33:52 crc kubenswrapper[4837]: I0111 17:33:52.658654 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="998f980d03503669fbff3f3dd9cee724c43b1a41050d8cc67ba424e9823a4220" exitCode=0 Jan 11 17:33:52 crc kubenswrapper[4837]: I0111 17:33:52.658684 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54" exitCode=0 Jan 11 17:33:52 crc kubenswrapper[4837]: I0111 17:33:52.658737 4837 scope.go:117] "RemoveContainer" containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" Jan 11 17:33:54 crc kubenswrapper[4837]: I0111 17:33:54.672602 4837 generic.go:334] "Generic (PLEG): container finished" podID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" containerID="34d7a443d8101750876ab0203b1b18d7d9680c9044ac64e95d8e47f90e59a144" exitCode=0 Jan 11 17:33:54 crc kubenswrapper[4837]: I0111 17:33:54.672741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1ec7b443-a2dc-4c7b-b661-03ce968d0878","Type":"ContainerDied","Data":"34d7a443d8101750876ab0203b1b18d7d9680c9044ac64e95d8e47f90e59a144"} Jan 11 17:33:54 crc kubenswrapper[4837]: I0111 17:33:54.673911 4837 
status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:33:54 crc kubenswrapper[4837]: I0111 17:33:54.675939 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 11 17:33:54 crc kubenswrapper[4837]: I0111 17:33:54.676581 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d" exitCode=0 Jan 11 17:33:54 crc kubenswrapper[4837]: I0111 17:33:54.676604 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c" exitCode=0 Jan 11 17:33:55 crc kubenswrapper[4837]: E0111 17:33:55.198244 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="6.4s" Jan 11 17:33:56 crc kubenswrapper[4837]: I0111 17:33:56.368224 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:33:57 crc kubenswrapper[4837]: E0111 17:33:57.159179 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-9blrl.1889bd87fbb677b7\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-9blrl.1889bd87fbb677b7 openshift-marketplace 28587 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-9blrl,UID:5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821,APIVersion:v1,ResourceVersion:28542,FieldPath:spec.initContainers{extract-content},},Reason:Pulling,Message:Pulling image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-11 17:32:00 +0000 UTC,LastTimestamp:2026-01-11 17:33:57.15795634 +0000 UTC m=+211.336149076,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 11 17:33:59 crc kubenswrapper[4837]: I0111 17:33:59.926401 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 11 17:33:59 crc kubenswrapper[4837]: I0111 17:33:59.929527 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:33:59 crc kubenswrapper[4837]: I0111 17:33:59.930312 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:33:59 crc kubenswrapper[4837]: I0111 17:33:59.931059 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.034851 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.034961 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.035035 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.035170 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.035283 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.035273 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.035515 4837 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.035601 4837 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.035717 4837 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.377504 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.581196 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.581638 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.730397 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.731923 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.732217 4837 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe" exitCode=1 Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.732299 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe"} Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.734161 4837 scope.go:117] "RemoveContainer" containerID="e4a6334ba6c5d767d1ea3901dae9be1569836c6043224f9ccedd0100448a80fe" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.734156 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.735101 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:00 crc 
kubenswrapper[4837]: I0111 17:34:00.740499 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.741544 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.742242 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.743072 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.743938 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.745935 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: 
connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.746506 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:00 crc kubenswrapper[4837]: I0111 17:34:00.747178 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.370113 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.371803 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.372438 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.372638 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.455539 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kube-api-access\") pod \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.455612 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kubelet-dir\") pod \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.455718 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-var-lock\") pod \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\" (UID: \"1ec7b443-a2dc-4c7b-b661-03ce968d0878\") " Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.455768 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1ec7b443-a2dc-4c7b-b661-03ce968d0878" (UID: "1ec7b443-a2dc-4c7b-b661-03ce968d0878"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.455912 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-var-lock" (OuterVolumeSpecName: "var-lock") pod "1ec7b443-a2dc-4c7b-b661-03ce968d0878" (UID: "1ec7b443-a2dc-4c7b-b661-03ce968d0878"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.456143 4837 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-var-lock\") on node \"crc\" DevicePath \"\"" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.456160 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.466015 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1ec7b443-a2dc-4c7b-b661-03ce968d0878" (UID: "1ec7b443-a2dc-4c7b-b661-03ce968d0878"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.557548 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec7b443-a2dc-4c7b-b661-03ce968d0878-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 11 17:34:01 crc kubenswrapper[4837]: E0111 17:34:01.599132 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="7s" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.747699 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1ec7b443-a2dc-4c7b-b661-03ce968d0878","Type":"ContainerDied","Data":"4dd5c6a3d2f214100793b6470f90f7583464cbc988cde026f9384f409830c00f"} Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.747736 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd5c6a3d2f214100793b6470f90f7583464cbc988cde026f9384f409830c00f" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.747783 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.771661 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.771898 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:01 crc kubenswrapper[4837]: I0111 17:34:01.772307 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:02 crc kubenswrapper[4837]: I0111 17:34:02.040769 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:34:02 crc kubenswrapper[4837]: I0111 17:34:02.342882 4837 scope.go:117] "RemoveContainer" containerID="998f980d03503669fbff3f3dd9cee724c43b1a41050d8cc67ba424e9823a4220" Jan 11 17:34:03 crc kubenswrapper[4837]: I0111 17:34:03.056221 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:34:03 crc kubenswrapper[4837]: I0111 17:34:03.191986 4837 scope.go:117] "RemoveContainer" 
containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" Jan 11 17:34:03 crc kubenswrapper[4837]: E0111 17:34:03.192457 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\": container with ID starting with 45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027 not found: ID does not exist" containerID="45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027" Jan 11 17:34:03 crc kubenswrapper[4837]: I0111 17:34:03.192505 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027"} err="failed to get container status \"45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\": rpc error: code = NotFound desc = could not find container \"45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027\": container with ID starting with 45dc7defae85e8f8ec0f441595434d86ab3cf87a7c81cafb9ca178af9a300027 not found: ID does not exist" Jan 11 17:34:03 crc kubenswrapper[4837]: I0111 17:34:03.192536 4837 scope.go:117] "RemoveContainer" containerID="c40f593938fad7c4202ddde8c6125eed092683e19464cc63448e7579e8174e54" Jan 11 17:34:03 crc kubenswrapper[4837]: W0111 17:34:03.640155 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-d28ea825afa0500e5eda6be7e1407769e4d2ef173f4fdc5500964d9c04da707c WatchSource:0}: Error finding container d28ea825afa0500e5eda6be7e1407769e4d2ef173f4fdc5500964d9c04da707c: Status 404 returned error can't find the container with id d28ea825afa0500e5eda6be7e1407769e4d2ef173f4fdc5500964d9c04da707c Jan 11 17:34:03 crc kubenswrapper[4837]: E0111 17:34:03.645957 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-9blrl.1889bd87fbb677b7\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-9blrl.1889bd87fbb677b7 openshift-marketplace 28587 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-9blrl,UID:5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821,APIVersion:v1,ResourceVersion:28542,FieldPath:spec.initContainers{extract-content},},Reason:Pulling,Message:Pulling image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-11 17:32:00 +0000 UTC,LastTimestamp:2026-01-11 17:33:57.15795634 +0000 UTC m=+211.336149076,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 11 17:34:03 crc kubenswrapper[4837]: I0111 17:34:03.765386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d28ea825afa0500e5eda6be7e1407769e4d2ef173f4fdc5500964d9c04da707c"} Jan 11 17:34:04 crc kubenswrapper[4837]: I0111 17:34:04.004314 4837 scope.go:117] "RemoveContainer" containerID="d3ff334f7925a2e69fa040b1ebfee0fd51669c60ff75534c1e08a9bf30b28d4d" Jan 11 17:34:04 crc kubenswrapper[4837]: E0111 17:34:04.089101 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 11 17:34:04 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419" Netns:"/var/run/netns/40fc3558-cc69-48c5-9610-dc38bc018671" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: [openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:34:04 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 11 17:34:04 crc kubenswrapper[4837]: > Jan 11 17:34:04 crc kubenswrapper[4837]: E0111 17:34:04.089181 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 11 17:34:04 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419" Netns:"/var/run/netns/40fc3558-cc69-48c5-9610-dc38bc018671" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: [openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:34:04 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 11 17:34:04 crc kubenswrapper[4837]: > 
pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:04 crc kubenswrapper[4837]: E0111 17:34:04.089206 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Jan 11 17:34:04 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419" Netns:"/var/run/netns/40fc3558-cc69-48c5-9610-dc38bc018671" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: [openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:34:04 crc kubenswrapper[4837]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 11 17:34:04 crc kubenswrapper[4837]: > pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:04 crc kubenswrapper[4837]: E0111 17:34:04.089278 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-b577cd54c-v775z_openshift-controller-manager(99d66a69-1de3-43d5-947f-1fc4611c3022)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-b577cd54c-v775z_openshift-controller-manager(99d66a69-1de3-43d5-947f-1fc4611c3022)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419\\\" Netns:\\\"/var/run/netns/40fc3558-cc69-48c5-9610-dc38bc018671\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=7a28dd61a8ef27828a3758679c51be112e056c30fe15ea25ab1837cddbe8c419;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: 
[openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s\\\": dial tcp 38.102.83.196:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" podUID="99d66a69-1de3-43d5-947f-1fc4611c3022" Jan 11 17:34:04 crc kubenswrapper[4837]: I0111 17:34:04.106929 4837 scope.go:117] "RemoveContainer" containerID="b0188eb536b523f8b0f5373307a7649fedd407a748d413a27ca1f61ed03e9e1d" Jan 11 17:34:04 crc kubenswrapper[4837]: I0111 17:34:04.773083 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:04 crc kubenswrapper[4837]: I0111 17:34:04.774146 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.042220 4837 scope.go:117] "RemoveContainer" containerID="7d60a2ce54687f5ded99fd2e70894c23fe35a18b1bd557e6f05edca994881a3c" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.368305 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.369190 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.582872 4837 scope.go:117] "RemoveContainer" containerID="7e80355f699d3f0ea37d79f71e08cabb59de047b88df75691a2da0ef7de9d52b" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.787403 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerStarted","Data":"93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8"} Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.788446 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc 
kubenswrapper[4837]: I0111 17:34:06.788776 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.789113 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.790609 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.791243 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.791319 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"60778049fee0a6ce1e3f3e3cb718660e850aae4bbbad898c810c79f5e51b67ef"} Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.791763 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial 
tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.792003 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.792308 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.793349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dht47" event={"ID":"2bf0823a-63d9-43c9-9cc7-1e2f364b7855","Type":"ContainerStarted","Data":"ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58"} Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.794134 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.794296 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc 
kubenswrapper[4837]: I0111 17:34:06.794493 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.794725 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:06 crc kubenswrapper[4837]: I0111 17:34:06.796199 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e739f1735147f4da1c7babcf2c3992b3c2cd28f99b4c18b0e9c868c2155bc9b2"} Jan 11 17:34:07 crc kubenswrapper[4837]: I0111 17:34:07.802951 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:07 crc kubenswrapper[4837]: I0111 17:34:07.803488 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:07 crc kubenswrapper[4837]: I0111 17:34:07.804009 4837 status_manager.go:851] 
"Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:07 crc kubenswrapper[4837]: I0111 17:34:07.804543 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:07 crc kubenswrapper[4837]: I0111 17:34:07.805024 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:08 crc kubenswrapper[4837]: E0111 17:34:08.690769 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="7s" Jan 11 17:34:09 crc kubenswrapper[4837]: E0111 17:34:09.301851 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:34:09Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:34:09Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:34:09Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-11T17:34:09Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:27da5043f12d5307a70c72f97a3fa66058dee448a5dec7cd83b0aa63f5496935\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f05e1dfe1f6582ffaf0843b908ef08d6fd1a032539e2d8ce20fd84ee0c4ec783\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1665092989},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[],\\\"sizeBytes\\\":1201976068},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:6b3b97e17390b5ee568393f2501a5fc412865074b8f6c5355ea48ab7c3983b7a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:8bb7ea6c489e90cb357c7f50fe8266a6a6c6e23e4931a5eaa0fd33a409db20e8\\\",\\\"registry.redhat.io
/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1175127379},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf039
6cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae
0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\
\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:09 crc kubenswrapper[4837]: E0111 17:34:09.302346 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:09 crc kubenswrapper[4837]: E0111 17:34:09.302799 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:09 crc kubenswrapper[4837]: E0111 17:34:09.303700 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:09 crc kubenswrapper[4837]: E0111 17:34:09.304199 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:09 crc kubenswrapper[4837]: E0111 17:34:09.304259 4837 kubelet_node_status.go:572] "Unable to update node 
status" err="update node status exceeds retry count" Jan 11 17:34:09 crc kubenswrapper[4837]: I0111 17:34:09.443824 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:34:09 crc kubenswrapper[4837]: I0111 17:34:09.443893 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.293756 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.294048 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.396643 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.397507 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.397749 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" 
pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.398054 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.398630 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.399106 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:10 crc kubenswrapper[4837]: I0111 17:34:10.581513 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.101657 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.101992 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-slftv" Jan 
11 17:34:11 crc kubenswrapper[4837]: E0111 17:34:11.555846 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 11 17:34:11 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c" Netns:"/var/run/netns/4cac2ad0-2126-4213-8908-a82dbbc3410d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: [openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:34:11 crc kubenswrapper[4837]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 11 17:34:11 crc kubenswrapper[4837]: > Jan 11 17:34:11 crc kubenswrapper[4837]: E0111 17:34:11.556536 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 11 17:34:11 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c" Netns:"/var/run/netns/4cac2ad0-2126-4213-8908-a82dbbc3410d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: [openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:34:11 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 11 17:34:11 crc kubenswrapper[4837]: > pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:11 crc kubenswrapper[4837]: E0111 17:34:11.556583 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Jan 11 17:34:11 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c" Netns:"/var/run/netns/4cac2ad0-2126-4213-8908-a82dbbc3410d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: 
[openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s": dial tcp 38.102.83.196:6443: connect: connection refused Jan 11 17:34:11 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 11 17:34:11 crc kubenswrapper[4837]: > pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:11 crc kubenswrapper[4837]: E0111 17:34:11.556747 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-b577cd54c-v775z_openshift-controller-manager(99d66a69-1de3-43d5-947f-1fc4611c3022)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-b577cd54c-v775z_openshift-controller-manager(99d66a69-1de3-43d5-947f-1fc4611c3022)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-b577cd54c-v775z_openshift-controller-manager_99d66a69-1de3-43d5-947f-1fc4611c3022_0(c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c): error adding pod openshift-controller-manager_controller-manager-b577cd54c-v775z to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed 
(add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c\\\" Netns:\\\"/var/run/netns/4cac2ad0-2126-4213-8908-a82dbbc3410d\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-b577cd54c-v775z;K8S_POD_INFRA_CONTAINER_ID=c9522f3ab0921218507c0a27c389ed8e5b04475d0df7853294aef3d0f1a25d2c;K8S_POD_UID=99d66a69-1de3-43d5-947f-1fc4611c3022\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-b577cd54c-v775z] networking: Multus: [openshift-controller-manager/controller-manager-b577cd54c-v775z/99d66a69-1de3-43d5-947f-1fc4611c3022]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-b577cd54c-v775z in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b577cd54c-v775z?timeout=1m0s\\\": dial tcp 38.102.83.196:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" podUID="99d66a69-1de3-43d5-947f-1fc4611c3022" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.830694 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29ts4" 
event={"ID":"30409294-8779-48ad-a6e8-36b662f09c0f","Type":"ContainerStarted","Data":"68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d"} Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.831645 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.832164 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.832575 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.832898 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.833180 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:11 crc kubenswrapper[4837]: I0111 17:34:11.833408 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.041219 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.049311 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.050042 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.050390 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.050777 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" 
pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.051042 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.051277 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.051561 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.135363 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-slftv" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="registry-server" probeResult="failure" output=< Jan 11 17:34:12 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 17:34:12 crc kubenswrapper[4837]: > Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.838559 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerID="360e607e10374d567a08d1492554aba5251c8aaf1f5261f894082ce4af7e526b" exitCode=0 Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.838659 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jk4fx" event={"ID":"1aecbb0e-6cc3-4308-a741-c1799ec8b541","Type":"ContainerDied","Data":"360e607e10374d567a08d1492554aba5251c8aaf1f5261f894082ce4af7e526b"} Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.840202 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.840524 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.840871 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.841186 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.841549 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.842091 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.842470 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.844099 4837 generic.go:334] "Generic (PLEG): container finished" podID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerID="8fb1b9dddaaa758c26194ce8b895f2c3edf56bee4706366bec94305c47cb43c3" exitCode=0 Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.844208 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7kkt" event={"ID":"df3f0f97-5906-44b5-99d5-6003e1b23be1","Type":"ContainerDied","Data":"8fb1b9dddaaa758c26194ce8b895f2c3edf56bee4706366bec94305c47cb43c3"} Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.845248 4837 status_manager.go:851] "Failed to get status for pod" 
podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.845617 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.845964 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.846333 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.846892 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.847342 4837 status_manager.go:851] "Failed to get status for pod" 
podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.847485 4837 generic.go:334] "Generic (PLEG): container finished" podID="30409294-8779-48ad-a6e8-36b662f09c0f" containerID="68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d" exitCode=0 Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.847544 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29ts4" event={"ID":"30409294-8779-48ad-a6e8-36b662f09c0f","Type":"ContainerDied","Data":"68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d"} Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.847781 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.848162 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.849125 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.849445 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.849882 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.850322 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.850762 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.851698 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.851936 4837 generic.go:334] "Generic (PLEG): container finished" podID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerID="421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753" exitCode=0 Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.851982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9blrl" event={"ID":"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821","Type":"ContainerDied","Data":"421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753"} Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.852033 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.852460 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.852951 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc 
kubenswrapper[4837]: I0111 17:34:12.853318 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.853707 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.854396 4837 status_manager.go:851] "Failed to get status for pod" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" pod="openshift-marketplace/redhat-marketplace-9blrl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9blrl\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.854765 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.854856 4837 generic.go:334] "Generic (PLEG): container finished" podID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerID="2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9" exitCode=0 Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.854891 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvd2s" 
event={"ID":"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c","Type":"ContainerDied","Data":"2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9"} Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.855116 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.855401 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.856355 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.856775 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.857208 4837 status_manager.go:851] "Failed to get status for pod" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" pod="openshift-marketplace/redhat-operators-pvd2s" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pvd2s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.857392 4837 generic.go:334] "Generic (PLEG): container finished" podID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerID="f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751" exitCode=0 Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.857439 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxndb" event={"ID":"1d39ff8b-c79a-46ea-af70-0902ce0ee504","Type":"ContainerDied","Data":"f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751"} Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.857573 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.857881 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.858032 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 
17:34:12.858180 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.858311 4837 status_manager.go:851] "Failed to get status for pod" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" pod="openshift-marketplace/redhat-marketplace-9blrl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9blrl\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.858607 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.859875 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.860367 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: 
I0111 17:34:12.861902 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.862468 4837 status_manager.go:851] "Failed to get status for pod" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" pod="openshift-marketplace/redhat-operators-pvd2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pvd2s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.862935 4837 status_manager.go:851] "Failed to get status for pod" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" pod="openshift-marketplace/community-operators-mxndb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mxndb\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.863392 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.863859 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 
17:34:12.865004 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.865445 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.865909 4837 status_manager.go:851] "Failed to get status for pod" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" pod="openshift-marketplace/redhat-marketplace-9blrl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9blrl\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.866405 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.866841 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.867313 
4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:12 crc kubenswrapper[4837]: I0111 17:34:12.867722 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.363431 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.364425 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.364994 4837 status_manager.go:851] "Failed to get status for pod" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" pod="openshift-marketplace/redhat-operators-pvd2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pvd2s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.365493 4837 status_manager.go:851] "Failed to get status for pod" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" pod="openshift-marketplace/community-operators-mxndb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mxndb\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.365965 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.366415 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.366957 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.367344 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.367872 4837 status_manager.go:851] "Failed to get status for pod" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" pod="openshift-marketplace/redhat-marketplace-9blrl" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9blrl\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.368603 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.369159 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.369645 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.386343 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.386389 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:13 crc kubenswrapper[4837]: E0111 17:34:13.386979 4837 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.387606 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:13 crc kubenswrapper[4837]: E0111 17:34:13.647360 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-9blrl.1889bd87fbb677b7\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-9blrl.1889bd87fbb677b7 openshift-marketplace 28587 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-9blrl,UID:5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821,APIVersion:v1,ResourceVersion:28542,FieldPath:spec.initContainers{extract-content},},Reason:Pulling,Message:Pulling image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\",Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-11 17:32:00 +0000 UTC,LastTimestamp:2026-01-11 17:33:57.15795634 +0000 UTC m=+211.336149076,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.868125 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c71094c8effe51e5937be20c7b909f81b824678d95654ef81160d3fca3131370"} Jan 11 17:34:13 crc kubenswrapper[4837]: I0111 17:34:13.868212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"82529288acfb65cef13e9cdd63622ff34bf883b1a20eb4f9ff12e9d0676247c3"} Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.874584 4837 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c71094c8effe51e5937be20c7b909f81b824678d95654ef81160d3fca3131370" exitCode=0 Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.874744 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c71094c8effe51e5937be20c7b909f81b824678d95654ef81160d3fca3131370"} Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.875016 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.875050 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.875483 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: E0111 17:34:14.875516 4837 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.875697 4837 status_manager.go:851] "Failed to get status for pod" 
podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.876414 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.876701 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.876936 4837 status_manager.go:851] "Failed to get status for pod" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" pod="openshift-marketplace/redhat-marketplace-9blrl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9blrl\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.877130 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.877310 4837 status_manager.go:851] "Failed to get status 
for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.877535 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.877760 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.877969 4837 status_manager.go:851] "Failed to get status for pod" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" pod="openshift-marketplace/redhat-operators-pvd2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pvd2s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.878168 4837 status_manager.go:851] "Failed to get status for pod" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" pod="openshift-marketplace/community-operators-mxndb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mxndb\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.879061 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-p7kkt" event={"ID":"df3f0f97-5906-44b5-99d5-6003e1b23be1","Type":"ContainerStarted","Data":"de99e4d2f93d0a5bbca6d85a22db32b108f9a98ea80b39fcee90267e5d310462"} Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.879637 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.879983 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.880254 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.880505 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.880752 4837 status_manager.go:851] "Failed to get status for pod" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" 
pod="openshift-marketplace/redhat-marketplace-9blrl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9blrl\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.881016 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.881259 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.881428 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29ts4" event={"ID":"30409294-8779-48ad-a6e8-36b662f09c0f","Type":"ContainerStarted","Data":"8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea"} Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.881511 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.881787 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.882058 4837 status_manager.go:851] "Failed to get status for pod" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" pod="openshift-marketplace/redhat-operators-pvd2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pvd2s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.882321 4837 status_manager.go:851] "Failed to get status for pod" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" pod="openshift-marketplace/community-operators-mxndb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mxndb\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.882653 4837 status_manager.go:851] "Failed to get status for pod" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" pod="openshift-marketplace/community-operators-p7kkt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p7kkt\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.883001 4837 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.883290 4837 status_manager.go:851] "Failed to get status for pod" podUID="59797bd6-cb69-412d-952b-1673312648e2" pod="openshift-marketplace/redhat-operators-slftv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-slftv\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.883526 4837 status_manager.go:851] "Failed to get status for pod" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" pod="openshift-marketplace/certified-operators-29ts4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-29ts4\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.883776 4837 status_manager.go:851] "Failed to get status for pod" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" pod="openshift-marketplace/redhat-marketplace-9blrl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-9blrl\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.884067 4837 status_manager.go:851] "Failed to get status for pod" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" pod="openshift-marketplace/certified-operators-jk4fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jk4fx\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.884312 4837 status_manager.go:851] "Failed to get status for pod" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" pod="openshift-marketplace/redhat-marketplace-dht47" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dht47\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.884643 4837 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.885025 4837 status_manager.go:851] "Failed to get status for pod" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.885279 4837 status_manager.go:851] "Failed to get status for pod" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" pod="openshift-marketplace/redhat-operators-pvd2s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pvd2s\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:14 crc kubenswrapper[4837]: I0111 17:34:14.885532 4837 status_manager.go:851] "Failed to get status for pod" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" pod="openshift-marketplace/community-operators-mxndb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mxndb\": dial tcp 38.102.83.196:6443: connect: connection refused" Jan 11 17:34:15 crc kubenswrapper[4837]: I0111 17:34:15.894557 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5617441f2f542c4d5a2db29e2cb77e6cc597f416226a41b8fa5dcddc0125293e"} Jan 11 17:34:15 crc kubenswrapper[4837]: I0111 17:34:15.894914 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f6e332ac216b402f3a5a0be12ee4bbc86db596997ed693821e100893341e98cb"} Jan 11 
17:34:15 crc kubenswrapper[4837]: I0111 17:34:15.898014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxndb" event={"ID":"1d39ff8b-c79a-46ea-af70-0902ce0ee504","Type":"ContainerStarted","Data":"fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90"} Jan 11 17:34:15 crc kubenswrapper[4837]: I0111 17:34:15.911584 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jk4fx" event={"ID":"1aecbb0e-6cc3-4308-a741-c1799ec8b541","Type":"ContainerStarted","Data":"9a28e04adf3e0396d207e7a32a1dafbe70b8bfaf68d43934c84b492e1875a5a8"} Jan 11 17:34:15 crc kubenswrapper[4837]: I0111 17:34:15.915691 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9blrl" event={"ID":"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821","Type":"ContainerStarted","Data":"4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8"} Jan 11 17:34:15 crc kubenswrapper[4837]: I0111 17:34:15.917979 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvd2s" event={"ID":"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c","Type":"ContainerStarted","Data":"b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4"} Jan 11 17:34:16 crc kubenswrapper[4837]: I0111 17:34:16.932756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7e561b4f4404d75406dadf0f02b0786b52b8997f5b9ee99c0d1eb2190847b59d"} Jan 11 17:34:16 crc kubenswrapper[4837]: I0111 17:34:16.933086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7a0758f3363b7376d0d201cfdf13aea1109be5d0bac1b88127742af1089f919f"} Jan 11 17:34:16 crc kubenswrapper[4837]: I0111 17:34:16.933100 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0853104b32ef09116ec1445b3aa375c2a4779c7ca2cb41092468eb2dc4faab1e"} Jan 11 17:34:16 crc kubenswrapper[4837]: I0111 17:34:16.933222 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:16 crc kubenswrapper[4837]: I0111 17:34:16.933319 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:16 crc kubenswrapper[4837]: I0111 17:34:16.933356 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:17 crc kubenswrapper[4837]: I0111 17:34:17.940243 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:34:17 crc kubenswrapper[4837]: I0111 17:34:17.940302 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.014751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.088716 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.088756 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.138536 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:34:18 
crc kubenswrapper[4837]: I0111 17:34:18.338336 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.338388 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.377641 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.387843 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.387880 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.393574 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.526309 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.526382 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:34:18 crc kubenswrapper[4837]: I0111 17:34:18.589630 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:34:19 crc kubenswrapper[4837]: I0111 17:34:19.881209 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:34:19 crc kubenswrapper[4837]: I0111 17:34:19.881834 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:34:19 crc kubenswrapper[4837]: I0111 17:34:19.953929 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:34:20 crc kubenswrapper[4837]: I0111 17:34:20.021475 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:34:20 crc kubenswrapper[4837]: I0111 17:34:20.361926 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:34:20 crc kubenswrapper[4837]: I0111 17:34:20.585895 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 11 17:34:21 crc kubenswrapper[4837]: I0111 17:34:21.138599 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:34:21 crc kubenswrapper[4837]: I0111 17:34:21.172327 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:34:21 crc kubenswrapper[4837]: I0111 17:34:21.561997 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:34:21 crc kubenswrapper[4837]: I0111 17:34:21.562312 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:34:21 crc kubenswrapper[4837]: I0111 17:34:21.957228 4837 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.210442 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8a3219d4-253a-4350-aadc-e2c99628d1fb" Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.363804 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.364306 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.609757 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pvd2s" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="registry-server" probeResult="failure" output=< Jan 11 17:34:22 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 17:34:22 crc kubenswrapper[4837]: > Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.967344 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" event={"ID":"99d66a69-1de3-43d5-947f-1fc4611c3022","Type":"ContainerStarted","Data":"673ae980dce39c77fe50aed9c3b8bcaaa849eeb42fa7dc141ad57c6de25790a3"} Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.967623 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.967733 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.972631 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8a3219d4-253a-4350-aadc-e2c99628d1fb" Jan 11 17:34:22 crc 
kubenswrapper[4837]: I0111 17:34:22.974152 4837 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://f6e332ac216b402f3a5a0be12ee4bbc86db596997ed693821e100893341e98cb" Jan 11 17:34:22 crc kubenswrapper[4837]: I0111 17:34:22.974229 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:34:23 crc kubenswrapper[4837]: I0111 17:34:23.976011 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:23 crc kubenswrapper[4837]: I0111 17:34:23.976091 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="46acf340-e77e-47f8-ba25-95b9b5870af3" Jan 11 17:34:23 crc kubenswrapper[4837]: I0111 17:34:23.978395 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8a3219d4-253a-4350-aadc-e2c99628d1fb" Jan 11 17:34:25 crc kubenswrapper[4837]: I0111 17:34:25.987373 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" event={"ID":"99d66a69-1de3-43d5-947f-1fc4611c3022","Type":"ContainerStarted","Data":"b717c4896ee849d8240fac8d8dffad05b9e9e18033d97ff49c7cd76f76fc62d5"} Jan 11 17:34:26 crc kubenswrapper[4837]: I0111 17:34:26.994622 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:26 crc kubenswrapper[4837]: I0111 17:34:26.999604 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:28.131834 4837 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:28.376741 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:28.566128 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:28.974682 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-mxndb" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="registry-server" probeResult="failure" output=< Jan 11 17:34:34 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 17:34:34 crc kubenswrapper[4837]: > Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:28.974937 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-mxndb" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="registry-server" probeResult="failure" output=< Jan 11 17:34:34 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 17:34:34 crc kubenswrapper[4837]: > Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:31.610592 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:31.698922 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:34:34 crc kubenswrapper[4837]: I0111 17:34:32.141886 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-slftv" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="registry-server" probeResult="failure" output=< Jan 11 17:34:34 crc 
kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 17:34:34 crc kubenswrapper[4837]: > Jan 11 17:34:37 crc kubenswrapper[4837]: I0111 17:34:37.980385 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:34:39 crc kubenswrapper[4837]: I0111 17:34:39.443893 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:34:39 crc kubenswrapper[4837]: I0111 17:34:39.443957 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:34:39 crc kubenswrapper[4837]: I0111 17:34:39.444008 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:34:39 crc kubenswrapper[4837]: I0111 17:34:39.444845 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef1d1b5ff926a1f2f0f357177d19214a1c92fbf76f445fb1a767d0d17cd1b4cb"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 17:34:39 crc kubenswrapper[4837]: I0111 17:34:39.444949 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" 
containerID="cri-o://ef1d1b5ff926a1f2f0f357177d19214a1c92fbf76f445fb1a767d0d17cd1b4cb" gracePeriod=600 Jan 11 17:34:40 crc kubenswrapper[4837]: I0111 17:34:40.075551 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="ef1d1b5ff926a1f2f0f357177d19214a1c92fbf76f445fb1a767d0d17cd1b4cb" exitCode=0 Jan 11 17:34:40 crc kubenswrapper[4837]: I0111 17:34:40.075669 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"ef1d1b5ff926a1f2f0f357177d19214a1c92fbf76f445fb1a767d0d17cd1b4cb"} Jan 11 17:34:42 crc kubenswrapper[4837]: I0111 17:34:42.088405 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"51a674f214ce0ccc55b2fa9005d4dce39df2f19cf7b9f9089590388fd9cdba68"} Jan 11 17:34:44 crc kubenswrapper[4837]: I0111 17:34:44.958078 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 11 17:34:46 crc kubenswrapper[4837]: I0111 17:34:46.044294 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 11 17:34:47 crc kubenswrapper[4837]: I0111 17:34:47.255325 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 11 17:34:47 crc kubenswrapper[4837]: I0111 17:34:47.875996 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 11 17:34:48 crc kubenswrapper[4837]: I0111 17:34:48.027027 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 11 17:34:48 crc 
kubenswrapper[4837]: I0111 17:34:48.232856 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 11 17:34:48 crc kubenswrapper[4837]: I0111 17:34:48.234026 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 11 17:34:48 crc kubenswrapper[4837]: I0111 17:34:48.952991 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 11 17:34:49 crc kubenswrapper[4837]: I0111 17:34:49.344868 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 11 17:34:49 crc kubenswrapper[4837]: I0111 17:34:49.488300 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 11 17:34:50 crc kubenswrapper[4837]: I0111 17:34:50.647843 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 11 17:34:50 crc kubenswrapper[4837]: I0111 17:34:50.908364 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 11 17:34:50 crc kubenswrapper[4837]: I0111 17:34:50.908422 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 11 17:34:50 crc kubenswrapper[4837]: I0111 17:34:50.952144 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 11 17:34:50 crc kubenswrapper[4837]: I0111 17:34:50.998749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 11 17:34:51 crc kubenswrapper[4837]: I0111 17:34:51.288479 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 11 17:34:51 crc kubenswrapper[4837]: I0111 17:34:51.417985 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 11 17:34:51 crc kubenswrapper[4837]: I0111 17:34:51.512732 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 11 17:34:51 crc kubenswrapper[4837]: I0111 17:34:51.881320 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 11 17:34:52 crc kubenswrapper[4837]: I0111 17:34:52.003932 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 11 17:34:52 crc kubenswrapper[4837]: I0111 17:34:52.174942 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 11 17:34:52 crc kubenswrapper[4837]: I0111 17:34:52.476253 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 11 17:34:52 crc kubenswrapper[4837]: I0111 17:34:52.608113 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 11 17:34:52 crc kubenswrapper[4837]: I0111 17:34:52.701911 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.003429 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.124449 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.156151 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.531341 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.668016 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.669988 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.702934 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 11 17:34:53 crc kubenswrapper[4837]: I0111 17:34:53.912976 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 11 17:34:54 crc kubenswrapper[4837]: I0111 17:34:54.309188 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 11 17:34:54 crc kubenswrapper[4837]: I0111 17:34:54.408492 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 11 17:34:54 crc kubenswrapper[4837]: I0111 17:34:54.445999 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 11 17:34:54 crc kubenswrapper[4837]: I0111 17:34:54.447996 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 11 17:34:54 crc kubenswrapper[4837]: I0111 17:34:54.702191 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 11 
17:34:54 crc kubenswrapper[4837]: I0111 17:34:54.813346 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.022667 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.023757 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.165295 4837 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.198568 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.265982 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.356709 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.370794 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.373488 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.411252 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.480921 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.528105 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.610157 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.723927 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.773571 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.923900 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 11 17:34:55 crc kubenswrapper[4837]: I0111 17:34:55.938966 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.130840 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.143639 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.246484 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.354527 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 11 17:34:56 crc 
kubenswrapper[4837]: I0111 17:34:56.378627 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.513252 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.550418 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.612915 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.673761 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.724166 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.745021 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.919022 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 11 17:34:56 crc kubenswrapper[4837]: I0111 17:34:56.995309 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.143175 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.277701 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.287755 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.441664 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.656617 4837 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.814643 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.887091 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.947383 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 11 17:34:57 crc kubenswrapper[4837]: I0111 17:34:57.989928 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 11 17:34:58 crc kubenswrapper[4837]: I0111 17:34:58.050223 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 11 17:34:58 crc kubenswrapper[4837]: I0111 17:34:58.154860 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 11 17:34:58 crc kubenswrapper[4837]: I0111 17:34:58.237861 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 11 17:34:58 crc kubenswrapper[4837]: I0111 17:34:58.249630 4837 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 11 17:34:58 crc kubenswrapper[4837]: I0111 17:34:58.441415 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 11 17:34:58 crc kubenswrapper[4837]: I0111 17:34:58.973376 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 11 17:34:58 crc kubenswrapper[4837]: I0111 17:34:58.983625 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.039000 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.039186 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.111862 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.126405 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.169380 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.212282 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerID="22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3" exitCode=0 Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.212330 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" event={"ID":"f9e34a1e-5456-4b26-b347-aa569c5987d5","Type":"ContainerDied","Data":"22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3"} Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.212869 4837 scope.go:117] "RemoveContainer" containerID="22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.338279 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.338652 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.502731 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.512154 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.795202 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.797906 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.814769 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.847468 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 11 17:34:59 crc 
kubenswrapper[4837]: I0111 17:34:59.913744 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 11 17:34:59 crc kubenswrapper[4837]: I0111 17:34:59.984062 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.026901 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.141341 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.203783 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.220991 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" event={"ID":"f9e34a1e-5456-4b26-b347-aa569c5987d5","Type":"ContainerStarted","Data":"bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72"} Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.221549 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.229288 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.255669 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.339665 4837 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 11 17:35:00 
crc kubenswrapper[4837]: I0111 17:35:00.339763 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxndb" podStartSLOduration=47.469463973 podStartE2EDuration="3m3.339730095s" podCreationTimestamp="2026-01-11 17:31:57 +0000 UTC" firstStartedPulling="2026-01-11 17:31:59.416771481 +0000 UTC m=+93.594964197" lastFinishedPulling="2026-01-11 17:34:15.287037603 +0000 UTC m=+229.465230319" observedRunningTime="2026-01-11 17:34:22.006763934 +0000 UTC m=+236.184956650" watchObservedRunningTime="2026-01-11 17:35:00.339730095 +0000 UTC m=+274.517922801" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.340846 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dht47" podStartSLOduration=60.185626019 podStartE2EDuration="3m1.340838156s" podCreationTimestamp="2026-01-11 17:31:59 +0000 UTC" firstStartedPulling="2026-01-11 17:32:01.467013658 +0000 UTC m=+95.645206364" lastFinishedPulling="2026-01-11 17:34:02.622225795 +0000 UTC m=+216.800418501" observedRunningTime="2026-01-11 17:34:22.12856228 +0000 UTC m=+236.306754986" watchObservedRunningTime="2026-01-11 17:35:00.340838156 +0000 UTC m=+274.519030862" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.341787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-slftv" podStartSLOduration=66.436851934 podStartE2EDuration="3m0.341780761s" podCreationTimestamp="2026-01-11 17:32:00 +0000 UTC" firstStartedPulling="2026-01-11 17:32:05.559253812 +0000 UTC m=+99.737446548" lastFinishedPulling="2026-01-11 17:33:59.464182629 +0000 UTC m=+213.642375375" observedRunningTime="2026-01-11 17:34:22.0600224 +0000 UTC m=+236.238215136" watchObservedRunningTime="2026-01-11 17:35:00.341780761 +0000 UTC m=+274.519973467" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.342204 4837 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/certified-operators-jk4fx" podStartSLOduration=46.733337075 podStartE2EDuration="3m2.342199874s" podCreationTimestamp="2026-01-11 17:31:58 +0000 UTC" firstStartedPulling="2026-01-11 17:31:59.425496514 +0000 UTC m=+93.603689220" lastFinishedPulling="2026-01-11 17:34:15.034359313 +0000 UTC m=+229.212552019" observedRunningTime="2026-01-11 17:34:22.285284148 +0000 UTC m=+236.463476854" watchObservedRunningTime="2026-01-11 17:35:00.342199874 +0000 UTC m=+274.520392580" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.343085 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p7kkt" podStartSLOduration=48.412407595 podStartE2EDuration="3m3.343080248s" podCreationTimestamp="2026-01-11 17:31:57 +0000 UTC" firstStartedPulling="2026-01-11 17:31:59.435086989 +0000 UTC m=+93.613279695" lastFinishedPulling="2026-01-11 17:34:14.365759642 +0000 UTC m=+228.543952348" observedRunningTime="2026-01-11 17:34:22.024818379 +0000 UTC m=+236.203011095" watchObservedRunningTime="2026-01-11 17:35:00.343080248 +0000 UTC m=+274.521272954" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.344608 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=74.34460384 podStartE2EDuration="1m14.34460384s" podCreationTimestamp="2026-01-11 17:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:34:22.142397978 +0000 UTC m=+236.320590684" watchObservedRunningTime="2026-01-11 17:35:00.34460384 +0000 UTC m=+274.522796546" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.344982 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" podStartSLOduration=147.34497806 
podStartE2EDuration="2m27.34497806s" podCreationTimestamp="2026-01-11 17:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:34:27.010839466 +0000 UTC m=+241.189032172" watchObservedRunningTime="2026-01-11 17:35:00.34497806 +0000 UTC m=+274.523170766" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.345296 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-29ts4" podStartSLOduration=48.557833705 podStartE2EDuration="3m3.34529387s" podCreationTimestamp="2026-01-11 17:31:57 +0000 UTC" firstStartedPulling="2026-01-11 17:31:59.406575662 +0000 UTC m=+93.584768368" lastFinishedPulling="2026-01-11 17:34:14.194035797 +0000 UTC m=+228.372228533" observedRunningTime="2026-01-11 17:34:22.074127938 +0000 UTC m=+236.252320644" watchObservedRunningTime="2026-01-11 17:35:00.34529387 +0000 UTC m=+274.523486576" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.345364 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9blrl" podStartSLOduration=47.05893227 podStartE2EDuration="3m1.345362072s" podCreationTimestamp="2026-01-11 17:31:59 +0000 UTC" firstStartedPulling="2026-01-11 17:32:00.460901335 +0000 UTC m=+94.639094051" lastFinishedPulling="2026-01-11 17:34:14.747331147 +0000 UTC m=+228.925523853" observedRunningTime="2026-01-11 17:34:22.26572699 +0000 UTC m=+236.443919696" watchObservedRunningTime="2026-01-11 17:35:00.345362072 +0000 UTC m=+274.523554778" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.345427 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pvd2s" podStartSLOduration=50.322402007 podStartE2EDuration="2m59.345425133s" podCreationTimestamp="2026-01-11 17:32:01 +0000 UTC" firstStartedPulling="2026-01-11 17:32:06.567214383 +0000 UTC 
m=+100.745407089" lastFinishedPulling="2026-01-11 17:34:15.590237509 +0000 UTC m=+229.768430215" observedRunningTime="2026-01-11 17:34:22.314275767 +0000 UTC m=+236.492468473" watchObservedRunningTime="2026-01-11 17:35:00.345425133 +0000 UTC m=+274.523617839" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.346080 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.346125 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.346148 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b577cd54c-v775z"] Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.351102 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.370267 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=39.370249284 podStartE2EDuration="39.370249284s" podCreationTimestamp="2026-01-11 17:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:35:00.364456292 +0000 UTC m=+274.542648998" watchObservedRunningTime="2026-01-11 17:35:00.370249284 +0000 UTC m=+274.548441990" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.706297 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.707094 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 
17:35:00.762425 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.837868 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.842729 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.846379 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 11 17:35:00 crc kubenswrapper[4837]: I0111 17:35:00.968840 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.001336 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.297622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.308246 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.417069 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.503120 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.553279 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.678221 4837 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.778572 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.945998 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 11 17:35:01 crc kubenswrapper[4837]: I0111 17:35:01.950374 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 11 17:35:02 crc kubenswrapper[4837]: I0111 17:35:02.005094 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 11 17:35:02 crc kubenswrapper[4837]: I0111 17:35:02.025472 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 11 17:35:02 crc kubenswrapper[4837]: I0111 17:35:02.296880 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 11 17:35:02 crc kubenswrapper[4837]: I0111 17:35:02.411816 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 11 17:35:02 crc kubenswrapper[4837]: I0111 17:35:02.891708 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.029398 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.096962 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.123190 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.238853 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.339981 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.564077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.745741 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.750284 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.919859 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.929937 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 11 17:35:03 crc kubenswrapper[4837]: I0111 17:35:03.983137 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 11 17:35:04 crc kubenswrapper[4837]: I0111 17:35:04.151836 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 11 17:35:04 crc kubenswrapper[4837]: I0111 17:35:04.640543 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 11 17:35:04 crc kubenswrapper[4837]: I0111 17:35:04.858791 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 11 17:35:04 crc kubenswrapper[4837]: I0111 17:35:04.982791 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 11 17:35:05 crc kubenswrapper[4837]: I0111 17:35:05.035923 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 11 17:35:05 crc kubenswrapper[4837]: I0111 17:35:05.057231 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 11 17:35:05 crc kubenswrapper[4837]: I0111 17:35:05.072627 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 11 17:35:05 crc kubenswrapper[4837]: I0111 17:35:05.209255 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 11 17:35:05 crc kubenswrapper[4837]: I0111 17:35:05.561404 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 11 17:35:05 crc kubenswrapper[4837]: I0111 17:35:05.806623 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 11 17:35:05 crc kubenswrapper[4837]: I0111 17:35:05.985149 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 11 
17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.047982 4837 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.048348 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e739f1735147f4da1c7babcf2c3992b3c2cd28f99b4c18b0e9c868c2155bc9b2" gracePeriod=5 Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.052749 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.138048 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.143603 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.150118 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.265376 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.582531 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.618180 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.629484 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.822968 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.838054 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 11 17:35:06 crc kubenswrapper[4837]: I0111 17:35:06.979264 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 11 17:35:07 crc kubenswrapper[4837]: I0111 17:35:07.213016 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 11 17:35:07 crc kubenswrapper[4837]: I0111 17:35:07.353018 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 11 17:35:07 crc kubenswrapper[4837]: I0111 17:35:07.422743 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 11 17:35:07 crc kubenswrapper[4837]: I0111 17:35:07.486018 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 11 17:35:07 crc kubenswrapper[4837]: I0111 17:35:07.583043 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 11 17:35:07 crc kubenswrapper[4837]: I0111 17:35:07.669511 4837 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 11 17:35:07 crc kubenswrapper[4837]: I0111 17:35:07.757091 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 11 17:35:08 crc kubenswrapper[4837]: I0111 17:35:08.191882 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 11 17:35:08 crc kubenswrapper[4837]: I0111 17:35:08.227254 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 11 17:35:08 crc kubenswrapper[4837]: I0111 17:35:08.458800 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 11 17:35:08 crc kubenswrapper[4837]: I0111 17:35:08.464126 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 11 17:35:08 crc kubenswrapper[4837]: I0111 17:35:08.567401 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:08.646239 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:08.813007 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.019610 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.038831 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.262435 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.295791 4837 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 11 17:35:09 crc 
kubenswrapper[4837]: I0111 17:35:09.370205 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.524001 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.528708 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.571583 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.571731 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.581073 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.665184 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.683604 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.751094 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 11 17:35:09 crc kubenswrapper[4837]: I0111 17:35:09.898354 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.010124 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 11 17:35:10 crc 
kubenswrapper[4837]: I0111 17:35:10.093886 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.315271 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.384853 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.460220 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.477753 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.543630 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.695269 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.758634 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.793343 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 11 17:35:10 crc kubenswrapper[4837]: I0111 17:35:10.982298 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 11 17:35:11 crc kubenswrapper[4837]: I0111 17:35:11.603542 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 11 17:35:11 crc kubenswrapper[4837]: I0111 17:35:11.751547 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 11 17:35:11 crc kubenswrapper[4837]: I0111 17:35:11.998509 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.017857 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.110833 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.136870 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.138320 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.142476 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.146948 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.295764 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.295810 4837 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerID="e739f1735147f4da1c7babcf2c3992b3c2cd28f99b4c18b0e9c868c2155bc9b2" exitCode=137 Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.330431 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.363526 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.513391 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.528977 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.808150 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.808251 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997331 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997486 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997492 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997621 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997523 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997650 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.997845 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.998131 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.998396 4837 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.998439 4837 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.998464 4837 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:12 crc kubenswrapper[4837]: I0111 17:35:12.998488 4837 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.013291 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.074059 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.100331 4837 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.101078 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b577cd54c-v775z"] Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.101388 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" podUID="99d66a69-1de3-43d5-947f-1fc4611c3022" containerName="controller-manager" containerID="cri-o://b717c4896ee849d8240fac8d8dffad05b9e9e18033d97ff49c7cd76f76fc62d5" gracePeriod=30 Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.183884 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.194026 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.194430 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj"] Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.194665 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" podUID="8fb3713b-b193-4a5a-a841-50d15d8331c5" 
containerName="route-controller-manager" containerID="cri-o://b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d" gracePeriod=30 Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.301819 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.301886 4837 scope.go:117] "RemoveContainer" containerID="e739f1735147f4da1c7babcf2c3992b3c2cd28f99b4c18b0e9c868c2155bc9b2" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.301925 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.556379 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.693235 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 11 17:35:13 crc kubenswrapper[4837]: I0111 17:35:13.844148 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.145317 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.148723 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.236433 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.265949 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"] Jan 11 17:35:14 crc kubenswrapper[4837]: E0111 17:35:14.266178 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" containerName="installer" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.266201 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" containerName="installer" Jan 11 17:35:14 crc kubenswrapper[4837]: E0111 17:35:14.266214 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb3713b-b193-4a5a-a841-50d15d8331c5" containerName="route-controller-manager" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.266222 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb3713b-b193-4a5a-a841-50d15d8331c5" containerName="route-controller-manager" Jan 11 17:35:14 crc kubenswrapper[4837]: E0111 17:35:14.266252 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.266260 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.266362 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec7b443-a2dc-4c7b-b661-03ce968d0878" containerName="installer" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.266374 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb3713b-b193-4a5a-a841-50d15d8331c5" containerName="route-controller-manager" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.266387 4837 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.266823 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.284091 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"] Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.310865 4837 generic.go:334] "Generic (PLEG): container finished" podID="99d66a69-1de3-43d5-947f-1fc4611c3022" containerID="b717c4896ee849d8240fac8d8dffad05b9e9e18033d97ff49c7cd76f76fc62d5" exitCode=0 Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.310922 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" event={"ID":"99d66a69-1de3-43d5-947f-1fc4611c3022","Type":"ContainerDied","Data":"b717c4896ee849d8240fac8d8dffad05b9e9e18033d97ff49c7cd76f76fc62d5"} Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.312521 4837 generic.go:334] "Generic (PLEG): container finished" podID="8fb3713b-b193-4a5a-a841-50d15d8331c5" containerID="b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d" exitCode=0 Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.312581 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" event={"ID":"8fb3713b-b193-4a5a-a841-50d15d8331c5","Type":"ContainerDied","Data":"b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d"} Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.312619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" 
event={"ID":"8fb3713b-b193-4a5a-a841-50d15d8331c5","Type":"ContainerDied","Data":"233746b4f7bea9a083ed6775c0f2d8422f607fbdd0d6895f689b4a0d4c9c789a"} Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.312646 4837 scope.go:117] "RemoveContainer" containerID="b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.312840 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.349785 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.358779 4837 scope.go:117] "RemoveContainer" containerID="b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d" Jan 11 17:35:14 crc kubenswrapper[4837]: E0111 17:35:14.359090 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d\": container with ID starting with b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d not found: ID does not exist" containerID="b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.359125 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d"} err="failed to get container status \"b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d\": rpc error: code = NotFound desc = could not find container \"b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d\": container with ID starting with b2d0f54aa8410d967b15e3f168e593e8a23b3380e2574cc5e973a75720d42b1d not found: ID does 
not exist" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.375429 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.376393 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.391376 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.391421 4837 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="be578014-0340-4e6e-89b6-09516bd010a8" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.391451 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.391470 4837 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="be578014-0340-4e6e-89b6-09516bd010a8" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.419997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gm8l\" (UniqueName: \"kubernetes.io/projected/8fb3713b-b193-4a5a-a841-50d15d8331c5-kube-api-access-5gm8l\") pod \"8fb3713b-b193-4a5a-a841-50d15d8331c5\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.420081 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-config\") pod \"8fb3713b-b193-4a5a-a841-50d15d8331c5\" (UID: 
\"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.420645 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb3713b-b193-4a5a-a841-50d15d8331c5-serving-cert\") pod \"8fb3713b-b193-4a5a-a841-50d15d8331c5\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.420742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-client-ca\") pod \"8fb3713b-b193-4a5a-a841-50d15d8331c5\" (UID: \"8fb3713b-b193-4a5a-a841-50d15d8331c5\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.421454 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.421462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-config" (OuterVolumeSpecName: "config") pod "8fb3713b-b193-4a5a-a841-50d15d8331c5" (UID: "8fb3713b-b193-4a5a-a841-50d15d8331c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.421627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-config\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.421705 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-serving-cert\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.421799 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-client-ca\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.421869 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh99p\" (UniqueName: \"kubernetes.io/projected/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-kube-api-access-wh99p\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.421967 4837 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.422456 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-client-ca" (OuterVolumeSpecName: "client-ca") pod "8fb3713b-b193-4a5a-a841-50d15d8331c5" (UID: "8fb3713b-b193-4a5a-a841-50d15d8331c5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.426289 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb3713b-b193-4a5a-a841-50d15d8331c5-kube-api-access-5gm8l" (OuterVolumeSpecName: "kube-api-access-5gm8l") pod "8fb3713b-b193-4a5a-a841-50d15d8331c5" (UID: "8fb3713b-b193-4a5a-a841-50d15d8331c5"). InnerVolumeSpecName "kube-api-access-5gm8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.426576 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb3713b-b193-4a5a-a841-50d15d8331c5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8fb3713b-b193-4a5a-a841-50d15d8331c5" (UID: "8fb3713b-b193-4a5a-a841-50d15d8331c5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.457799 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.522753 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-proxy-ca-bundles\") pod \"99d66a69-1de3-43d5-947f-1fc4611c3022\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.522934 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-client-ca\") pod \"99d66a69-1de3-43d5-947f-1fc4611c3022\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.522982 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kx4l\" (UniqueName: \"kubernetes.io/projected/99d66a69-1de3-43d5-947f-1fc4611c3022-kube-api-access-8kx4l\") pod \"99d66a69-1de3-43d5-947f-1fc4611c3022\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-config\") pod \"99d66a69-1de3-43d5-947f-1fc4611c3022\" (UID: \"99d66a69-1de3-43d5-947f-1fc4611c3022\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523037 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d66a69-1de3-43d5-947f-1fc4611c3022-serving-cert\") pod \"99d66a69-1de3-43d5-947f-1fc4611c3022\" (UID: 
\"99d66a69-1de3-43d5-947f-1fc4611c3022\") " Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523201 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-config\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-serving-cert\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523292 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-client-ca\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523357 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh99p\" (UniqueName: \"kubernetes.io/projected/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-kube-api-access-wh99p\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523409 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gm8l\" (UniqueName: 
\"kubernetes.io/projected/8fb3713b-b193-4a5a-a841-50d15d8331c5-kube-api-access-5gm8l\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523422 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb3713b-b193-4a5a-a841-50d15d8331c5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523430 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fb3713b-b193-4a5a-a841-50d15d8331c5-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.523988 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99d66a69-1de3-43d5-947f-1fc4611c3022" (UID: "99d66a69-1de3-43d5-947f-1fc4611c3022"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.524181 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-config" (OuterVolumeSpecName: "config") pod "99d66a69-1de3-43d5-947f-1fc4611c3022" (UID: "99d66a69-1de3-43d5-947f-1fc4611c3022"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.524568 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-config\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.524910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-client-ca" (OuterVolumeSpecName: "client-ca") pod "99d66a69-1de3-43d5-947f-1fc4611c3022" (UID: "99d66a69-1de3-43d5-947f-1fc4611c3022"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.525505 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-client-ca\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.525732 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d66a69-1de3-43d5-947f-1fc4611c3022-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99d66a69-1de3-43d5-947f-1fc4611c3022" (UID: "99d66a69-1de3-43d5-947f-1fc4611c3022"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.528857 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-serving-cert\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.529124 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d66a69-1de3-43d5-947f-1fc4611c3022-kube-api-access-8kx4l" (OuterVolumeSpecName: "kube-api-access-8kx4l") pod "99d66a69-1de3-43d5-947f-1fc4611c3022" (UID: "99d66a69-1de3-43d5-947f-1fc4611c3022"). InnerVolumeSpecName "kube-api-access-8kx4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.539068 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh99p\" (UniqueName: \"kubernetes.io/projected/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-kube-api-access-wh99p\") pod \"route-controller-manager-75766b8986-l8bhc\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") " pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.583234 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.586458 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.624712 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.624764 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kx4l\" (UniqueName: \"kubernetes.io/projected/99d66a69-1de3-43d5-947f-1fc4611c3022-kube-api-access-8kx4l\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.624788 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.624805 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d66a69-1de3-43d5-947f-1fc4611c3022-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.624823 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d66a69-1de3-43d5-947f-1fc4611c3022-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.662588 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj"] Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.669479 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6d57cc69-fsbzj"] Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.675278 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.786600 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.832886 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 11 17:35:14 crc kubenswrapper[4837]: I0111 17:35:14.886091 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"] Jan 11 17:35:14 crc kubenswrapper[4837]: W0111 17:35:14.886587 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b20a17_42a9_4f8a_98ff_1f21615f9d2c.slice/crio-f49096fd3605368603625cbcc3e48927b1385284f570bcd219e68a2c8d17416e WatchSource:0}: Error finding container f49096fd3605368603625cbcc3e48927b1385284f570bcd219e68a2c8d17416e: Status 404 returned error can't find the container with id f49096fd3605368603625cbcc3e48927b1385284f570bcd219e68a2c8d17416e Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.052628 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.322245 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.322238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b577cd54c-v775z" event={"ID":"99d66a69-1de3-43d5-947f-1fc4611c3022","Type":"ContainerDied","Data":"673ae980dce39c77fe50aed9c3b8bcaaa849eeb42fa7dc141ad57c6de25790a3"} Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.322441 4837 scope.go:117] "RemoveContainer" containerID="b717c4896ee849d8240fac8d8dffad05b9e9e18033d97ff49c7cd76f76fc62d5" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.328313 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" event={"ID":"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c","Type":"ContainerStarted","Data":"f49096fd3605368603625cbcc3e48927b1385284f570bcd219e68a2c8d17416e"} Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.353270 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.374169 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b577cd54c-v775z"] Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.381242 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b577cd54c-v775z"] Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.549703 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.630357 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.644534 
4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.763110 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.807505 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 11 17:35:15 crc kubenswrapper[4837]: I0111 17:35:15.999721 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.227883 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.231106 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.335117 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" event={"ID":"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c","Type":"ContainerStarted","Data":"818dcc714fa566af1fe0883879696614c9c2e99546dfd30cea8ecacd99208cd8"} Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.335567 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.367854 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" podStartSLOduration=3.367825031 podStartE2EDuration="3.367825031s" podCreationTimestamp="2026-01-11 17:35:13 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:35:16.361757532 +0000 UTC m=+290.539950278" watchObservedRunningTime="2026-01-11 17:35:16.367825031 +0000 UTC m=+290.546017767" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.369355 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.378464 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb3713b-b193-4a5a-a841-50d15d8331c5" path="/var/lib/kubelet/pods/8fb3713b-b193-4a5a-a841-50d15d8331c5/volumes" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.379142 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d66a69-1de3-43d5-947f-1fc4611c3022" path="/var/lib/kubelet/pods/99d66a69-1de3-43d5-947f-1fc4611c3022/volumes" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.405742 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.503403 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.797307 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86c687b466-kld8z"] Jan 11 17:35:16 crc kubenswrapper[4837]: E0111 17:35:16.797507 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d66a69-1de3-43d5-947f-1fc4611c3022" containerName="controller-manager" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.797521 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d66a69-1de3-43d5-947f-1fc4611c3022" containerName="controller-manager" Jan 11 17:35:16 crc 
kubenswrapper[4837]: I0111 17:35:16.797616 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d66a69-1de3-43d5-947f-1fc4611c3022" containerName="controller-manager" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.798032 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.800233 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.800249 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.800434 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.800539 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.800700 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.806233 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.808497 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86c687b466-kld8z"] Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.816500 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.861218 4837 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.991294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-client-ca\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.991789 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqnwm\" (UniqueName: \"kubernetes.io/projected/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-kube-api-access-fqnwm\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.992042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-serving-cert\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.992236 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-config\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:16 crc kubenswrapper[4837]: I0111 17:35:16.992443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-proxy-ca-bundles\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.092931 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-config\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.092994 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-proxy-ca-bundles\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.093048 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-client-ca\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.093086 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqnwm\" (UniqueName: \"kubernetes.io/projected/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-kube-api-access-fqnwm\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc 
kubenswrapper[4837]: I0111 17:35:17.093131 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-serving-cert\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.094809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-client-ca\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.095494 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-proxy-ca-bundles\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.096090 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-config\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.101278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-serving-cert\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " 
pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.115665 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqnwm\" (UniqueName: \"kubernetes.io/projected/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-kube-api-access-fqnwm\") pod \"controller-manager-86c687b466-kld8z\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") " pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.154642 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.163492 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.320311 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.388573 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.397570 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.407198 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86c687b466-kld8z"] Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.408430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.507038 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.555625 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.633975 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.684396 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.726942 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 11 17:35:17 crc kubenswrapper[4837]: I0111 17:35:17.965943 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 11 17:35:18 crc kubenswrapper[4837]: I0111 17:35:18.352085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" event={"ID":"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7","Type":"ContainerStarted","Data":"1d35738e3a7c971de04bba8b53b2cf6b89bc84f420f77fbf68d5348fc8cae140"} Jan 11 17:35:19 crc kubenswrapper[4837]: I0111 17:35:19.299761 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 11 17:35:19 crc kubenswrapper[4837]: I0111 17:35:19.584252 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 11 17:35:20 crc kubenswrapper[4837]: I0111 17:35:20.703210 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 11 17:35:21 crc kubenswrapper[4837]: I0111 17:35:21.379494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" event={"ID":"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7","Type":"ContainerStarted","Data":"aab7c26b21f17ac4bedfc0d6b6de2b936f345c71cc823e290c79245c8a20ca7f"}
Jan 11 17:35:21 crc kubenswrapper[4837]: I0111 17:35:21.871164 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 11 17:35:22 crc kubenswrapper[4837]: I0111 17:35:22.383951 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z"
Jan 11 17:35:22 crc kubenswrapper[4837]: I0111 17:35:22.388764 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z"
Jan 11 17:35:22 crc kubenswrapper[4837]: I0111 17:35:22.404563 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" podStartSLOduration=9.40454241 podStartE2EDuration="9.40454241s" podCreationTimestamp="2026-01-11 17:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:35:22.400316463 +0000 UTC m=+296.578509169" watchObservedRunningTime="2026-01-11 17:35:22.40454241 +0000 UTC m=+296.582735126"
Jan 11 17:35:22 crc kubenswrapper[4837]: I0111 17:35:22.749136 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 11 17:35:24 crc kubenswrapper[4837]: I0111 17:35:24.113868 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 11 17:35:25 crc kubenswrapper[4837]: I0111 17:35:25.985738 4837 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Jan 11 17:35:35 crc kubenswrapper[4837]: I0111 17:35:35.169631 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86c687b466-kld8z"]
Jan 11 17:35:35 crc kubenswrapper[4837]: I0111 17:35:35.170444 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" podUID="afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" containerName="controller-manager" containerID="cri-o://aab7c26b21f17ac4bedfc0d6b6de2b936f345c71cc823e290c79245c8a20ca7f" gracePeriod=30
Jan 11 17:35:35 crc kubenswrapper[4837]: I0111 17:35:35.191390 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"]
Jan 11 17:35:35 crc kubenswrapper[4837]: I0111 17:35:35.191730 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" podUID="c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" containerName="route-controller-manager" containerID="cri-o://818dcc714fa566af1fe0883879696614c9c2e99546dfd30cea8ecacd99208cd8" gracePeriod=30
Jan 11 17:35:36 crc kubenswrapper[4837]: I0111 17:35:36.475425 4837 generic.go:334] "Generic (PLEG): container finished" podID="afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" containerID="aab7c26b21f17ac4bedfc0d6b6de2b936f345c71cc823e290c79245c8a20ca7f" exitCode=0
Jan 11 17:35:36 crc kubenswrapper[4837]: I0111 17:35:36.475559 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" event={"ID":"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7","Type":"ContainerDied","Data":"aab7c26b21f17ac4bedfc0d6b6de2b936f345c71cc823e290c79245c8a20ca7f"}
Jan 11 17:35:36 crc kubenswrapper[4837]: I0111 17:35:36.478269 4837 generic.go:334] "Generic (PLEG): container finished" podID="c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" containerID="818dcc714fa566af1fe0883879696614c9c2e99546dfd30cea8ecacd99208cd8" exitCode=0
Jan 11 17:35:36 crc kubenswrapper[4837]: I0111 17:35:36.478326 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" event={"ID":"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c","Type":"ContainerDied","Data":"818dcc714fa566af1fe0883879696614c9c2e99546dfd30cea8ecacd99208cd8"}
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.246104 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.251941 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.284890 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-config\") pod \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.284994 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-client-ca\") pod \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.285032 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh99p\" (UniqueName: \"kubernetes.io/projected/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-kube-api-access-wh99p\") pod \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.285054 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-client-ca\") pod \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.285086 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-serving-cert\") pod \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.285152 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqnwm\" (UniqueName: \"kubernetes.io/projected/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-kube-api-access-fqnwm\") pod \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.285177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-proxy-ca-bundles\") pod \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.285199 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-config\") pod \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\" (UID: \"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.285227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-serving-cert\") pod \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\" (UID: \"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7\") "
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.286510 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" (UID: "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.286788 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" (UID: "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.288169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-config" (OuterVolumeSpecName: "config") pod "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" (UID: "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.289408 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-client-ca" (OuterVolumeSpecName: "client-ca") pod "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" (UID: "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.289973 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-config" (OuterVolumeSpecName: "config") pod "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" (UID: "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.290955 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"]
Jan 11 17:35:37 crc kubenswrapper[4837]: E0111 17:35:37.291406 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" containerName="controller-manager"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.291431 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" containerName="controller-manager"
Jan 11 17:35:37 crc kubenswrapper[4837]: E0111 17:35:37.291460 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" containerName="route-controller-manager"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.291472 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" containerName="route-controller-manager"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.292170 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" containerName="route-controller-manager"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.292207 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" containerName="controller-manager"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.292837 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.292961 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-kube-api-access-fqnwm" (OuterVolumeSpecName: "kube-api-access-fqnwm") pod "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" (UID: "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7"). InnerVolumeSpecName "kube-api-access-fqnwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.300398 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"]
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.303845 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" (UID: "afffd243-4d6f-49f0-b71b-bbc1e3e71ff7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.307303 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" (UID: "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.309537 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-kube-api-access-wh99p" (OuterVolumeSpecName: "kube-api-access-wh99p") pod "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" (UID: "c2b20a17-42a9-4f8a-98ff-1f21615f9d2c"). InnerVolumeSpecName "kube-api-access-wh99p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.386841 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd78w\" (UniqueName: \"kubernetes.io/projected/e409442a-5f44-4cb2-a615-be75544036bd-kube-api-access-rd78w\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e409442a-5f44-4cb2-a615-be75544036bd-serving-cert\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-client-ca\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-config\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387417 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqnwm\" (UniqueName: \"kubernetes.io/projected/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-kube-api-access-fqnwm\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387432 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387445 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387457 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387469 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387481 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7-client-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387493 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh99p\" (UniqueName: \"kubernetes.io/projected/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-kube-api-access-wh99p\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387505 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-client-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.387515 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.488222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd78w\" (UniqueName: \"kubernetes.io/projected/e409442a-5f44-4cb2-a615-be75544036bd-kube-api-access-rd78w\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.488328 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e409442a-5f44-4cb2-a615-be75544036bd-serving-cert\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.488376 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-client-ca\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.488477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-config\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.490390 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc" event={"ID":"c2b20a17-42a9-4f8a-98ff-1f21615f9d2c","Type":"ContainerDied","Data":"f49096fd3605368603625cbcc3e48927b1385284f570bcd219e68a2c8d17416e"}
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.490491 4837 scope.go:117] "RemoveContainer" containerID="818dcc714fa566af1fe0883879696614c9c2e99546dfd30cea8ecacd99208cd8"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.490614 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.490854 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-client-ca\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.491300 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-config\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.495236 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" event={"ID":"afffd243-4d6f-49f0-b71b-bbc1e3e71ff7","Type":"ContainerDied","Data":"1d35738e3a7c971de04bba8b53b2cf6b89bc84f420f77fbf68d5348fc8cae140"}
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.495436 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.499987 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e409442a-5f44-4cb2-a615-be75544036bd-serving-cert\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.523263 4837 scope.go:117] "RemoveContainer" containerID="aab7c26b21f17ac4bedfc0d6b6de2b936f345c71cc823e290c79245c8a20ca7f"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.526073 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd78w\" (UniqueName: \"kubernetes.io/projected/e409442a-5f44-4cb2-a615-be75544036bd-kube-api-access-rd78w\") pod \"route-controller-manager-6d64b775c9-dr5p4\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.550740 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86c687b466-kld8z"]
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.554997 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86c687b466-kld8z"]
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.562112 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"]
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.566441 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75766b8986-l8bhc"]
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.637521 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:37 crc kubenswrapper[4837]: I0111 17:35:37.902795 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"]
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.165856 4837 patch_prober.go:28] interesting pod/controller-manager-86c687b466-kld8z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.166322 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-86c687b466-kld8z" podUID="afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.379469 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afffd243-4d6f-49f0-b71b-bbc1e3e71ff7" path="/var/lib/kubelet/pods/afffd243-4d6f-49f0-b71b-bbc1e3e71ff7/volumes"
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.380766 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b20a17-42a9-4f8a-98ff-1f21615f9d2c" path="/var/lib/kubelet/pods/c2b20a17-42a9-4f8a-98ff-1f21615f9d2c/volumes"
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.505520 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4" event={"ID":"e409442a-5f44-4cb2-a615-be75544036bd","Type":"ContainerStarted","Data":"3b11c074ec2c6bf85ba79d1a90dacbe8c7dcbb54d0058aef3f715d9de43be2c5"}
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.505583 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4" event={"ID":"e409442a-5f44-4cb2-a615-be75544036bd","Type":"ContainerStarted","Data":"9e7d7cebe2ba72e06107d49a293e481be73b3ef516a9f773351afd0e12a288c0"}
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.506132 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.970519 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:38 crc kubenswrapper[4837]: I0111 17:35:38.991016 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4" podStartSLOduration=3.990989164 podStartE2EDuration="3.990989164s" podCreationTimestamp="2026-01-11 17:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:35:38.535534021 +0000 UTC m=+312.713726737" watchObservedRunningTime="2026-01-11 17:35:38.990989164 +0000 UTC m=+313.169181910"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.822045 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d887777b5-jjg7k"]
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.823347 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.837359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d887777b5-jjg7k"]
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.838270 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.838533 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.838737 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.838983 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.841632 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.842055 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.845432 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.930930 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48d5s\" (UniqueName: \"kubernetes.io/projected/a9d572b6-d310-4a29-af60-e1a39c5a8859-kube-api-access-48d5s\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.930988 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-client-ca\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.931017 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-config\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.931095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d572b6-d310-4a29-af60-e1a39c5a8859-serving-cert\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:39 crc kubenswrapper[4837]: I0111 17:35:39.931116 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-proxy-ca-bundles\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.032447 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48d5s\" (UniqueName: \"kubernetes.io/projected/a9d572b6-d310-4a29-af60-e1a39c5a8859-kube-api-access-48d5s\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.032596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-client-ca\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.032668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-config\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.032891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d572b6-d310-4a29-af60-e1a39c5a8859-serving-cert\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.032939 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-proxy-ca-bundles\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.035429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-proxy-ca-bundles\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.036994 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-config\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.038930 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-client-ca\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.043359 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d572b6-d310-4a29-af60-e1a39c5a8859-serving-cert\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.063390 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48d5s\" (UniqueName: \"kubernetes.io/projected/a9d572b6-d310-4a29-af60-e1a39c5a8859-kube-api-access-48d5s\") pod \"controller-manager-5d887777b5-jjg7k\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.189318 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:40 crc kubenswrapper[4837]: I0111 17:35:40.664332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d887777b5-jjg7k"]
Jan 11 17:35:41 crc kubenswrapper[4837]: I0111 17:35:41.531434 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" event={"ID":"a9d572b6-d310-4a29-af60-e1a39c5a8859","Type":"ContainerStarted","Data":"0057d3b44c8439c0e3378f8419bb78076cafbe98689de885c4f692c37447060c"}
Jan 11 17:35:41 crc kubenswrapper[4837]: I0111 17:35:41.531816 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" event={"ID":"a9d572b6-d310-4a29-af60-e1a39c5a8859","Type":"ContainerStarted","Data":"7363ab73ecd62783c0cd1a3a3f2fb1370d18ca14066f9cb558552045f707dc91"}
Jan 11 17:35:41 crc kubenswrapper[4837]: I0111 17:35:41.531843 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:41 crc kubenswrapper[4837]: I0111 17:35:41.536383 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k"
Jan 11 17:35:41 crc kubenswrapper[4837]: I0111 17:35:41.549861 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" podStartSLOduration=6.549843294 podStartE2EDuration="6.549843294s" podCreationTimestamp="2026-01-11 17:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:35:41.547394205 +0000 UTC m=+315.725586911" watchObservedRunningTime="2026-01-11 17:35:41.549843294 +0000 UTC m=+315.728036000"
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.093313 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d887777b5-jjg7k"]
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.094085 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" podUID="a9d572b6-d310-4a29-af60-e1a39c5a8859" containerName="controller-manager" containerID="cri-o://0057d3b44c8439c0e3378f8419bb78076cafbe98689de885c4f692c37447060c" gracePeriod=30
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.190464 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"]
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.190736 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4" podUID="e409442a-5f44-4cb2-a615-be75544036bd" containerName="route-controller-manager" containerID="cri-o://3b11c074ec2c6bf85ba79d1a90dacbe8c7dcbb54d0058aef3f715d9de43be2c5" gracePeriod=30
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.619458 4837 generic.go:334] "Generic (PLEG): container finished" podID="e409442a-5f44-4cb2-a615-be75544036bd" containerID="3b11c074ec2c6bf85ba79d1a90dacbe8c7dcbb54d0058aef3f715d9de43be2c5" exitCode=0
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.619537 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4" event={"ID":"e409442a-5f44-4cb2-a615-be75544036bd","Type":"ContainerDied","Data":"3b11c074ec2c6bf85ba79d1a90dacbe8c7dcbb54d0058aef3f715d9de43be2c5"}
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.620800 4837 generic.go:334] "Generic (PLEG): container finished" podID="a9d572b6-d310-4a29-af60-e1a39c5a8859" containerID="0057d3b44c8439c0e3378f8419bb78076cafbe98689de885c4f692c37447060c" exitCode=0
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.620830 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" event={"ID":"a9d572b6-d310-4a29-af60-e1a39c5a8859","Type":"ContainerDied","Data":"0057d3b44c8439c0e3378f8419bb78076cafbe98689de885c4f692c37447060c"}
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.731294 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.916572 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd78w\" (UniqueName: \"kubernetes.io/projected/e409442a-5f44-4cb2-a615-be75544036bd-kube-api-access-rd78w\") pod \"e409442a-5f44-4cb2-a615-be75544036bd\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") "
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.916623 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-config\") pod \"e409442a-5f44-4cb2-a615-be75544036bd\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") "
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.916668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-client-ca\") pod \"e409442a-5f44-4cb2-a615-be75544036bd\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") "
Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.916770 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/e409442a-5f44-4cb2-a615-be75544036bd-serving-cert\") pod \"e409442a-5f44-4cb2-a615-be75544036bd\" (UID: \"e409442a-5f44-4cb2-a615-be75544036bd\") " Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.917520 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-config" (OuterVolumeSpecName: "config") pod "e409442a-5f44-4cb2-a615-be75544036bd" (UID: "e409442a-5f44-4cb2-a615-be75544036bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.917557 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-client-ca" (OuterVolumeSpecName: "client-ca") pod "e409442a-5f44-4cb2-a615-be75544036bd" (UID: "e409442a-5f44-4cb2-a615-be75544036bd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.923234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e409442a-5f44-4cb2-a615-be75544036bd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e409442a-5f44-4cb2-a615-be75544036bd" (UID: "e409442a-5f44-4cb2-a615-be75544036bd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:35:53 crc kubenswrapper[4837]: I0111 17:35:53.923461 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e409442a-5f44-4cb2-a615-be75544036bd-kube-api-access-rd78w" (OuterVolumeSpecName: "kube-api-access-rd78w") pod "e409442a-5f44-4cb2-a615-be75544036bd" (UID: "e409442a-5f44-4cb2-a615-be75544036bd"). InnerVolumeSpecName "kube-api-access-rd78w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.018161 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd78w\" (UniqueName: \"kubernetes.io/projected/e409442a-5f44-4cb2-a615-be75544036bd-kube-api-access-rd78w\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.018217 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.018234 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e409442a-5f44-4cb2-a615-be75544036bd-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.018246 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e409442a-5f44-4cb2-a615-be75544036bd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.115176 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.220415 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-config\") pod \"a9d572b6-d310-4a29-af60-e1a39c5a8859\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.220464 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d572b6-d310-4a29-af60-e1a39c5a8859-serving-cert\") pod \"a9d572b6-d310-4a29-af60-e1a39c5a8859\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.220560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-proxy-ca-bundles\") pod \"a9d572b6-d310-4a29-af60-e1a39c5a8859\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.220667 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48d5s\" (UniqueName: \"kubernetes.io/projected/a9d572b6-d310-4a29-af60-e1a39c5a8859-kube-api-access-48d5s\") pod \"a9d572b6-d310-4a29-af60-e1a39c5a8859\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.220772 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-client-ca\") pod \"a9d572b6-d310-4a29-af60-e1a39c5a8859\" (UID: \"a9d572b6-d310-4a29-af60-e1a39c5a8859\") " Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.221624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9d572b6-d310-4a29-af60-e1a39c5a8859" (UID: "a9d572b6-d310-4a29-af60-e1a39c5a8859"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.221757 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9d572b6-d310-4a29-af60-e1a39c5a8859" (UID: "a9d572b6-d310-4a29-af60-e1a39c5a8859"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.221956 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-config" (OuterVolumeSpecName: "config") pod "a9d572b6-d310-4a29-af60-e1a39c5a8859" (UID: "a9d572b6-d310-4a29-af60-e1a39c5a8859"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.222011 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.222037 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.225482 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d572b6-d310-4a29-af60-e1a39c5a8859-kube-api-access-48d5s" (OuterVolumeSpecName: "kube-api-access-48d5s") pod "a9d572b6-d310-4a29-af60-e1a39c5a8859" (UID: "a9d572b6-d310-4a29-af60-e1a39c5a8859"). InnerVolumeSpecName "kube-api-access-48d5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.226585 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d572b6-d310-4a29-af60-e1a39c5a8859-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9d572b6-d310-4a29-af60-e1a39c5a8859" (UID: "a9d572b6-d310-4a29-af60-e1a39c5a8859"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.323160 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48d5s\" (UniqueName: \"kubernetes.io/projected/a9d572b6-d310-4a29-af60-e1a39c5a8859-kube-api-access-48d5s\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.323194 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d572b6-d310-4a29-af60-e1a39c5a8859-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.323205 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d572b6-d310-4a29-af60-e1a39c5a8859-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.501462 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jk4fx"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.501713 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jk4fx" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="registry-server" containerID="cri-o://9a28e04adf3e0396d207e7a32a1dafbe70b8bfaf68d43934c84b492e1875a5a8" gracePeriod=2 Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.629580 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" event={"ID":"a9d572b6-d310-4a29-af60-e1a39c5a8859","Type":"ContainerDied","Data":"7363ab73ecd62783c0cd1a3a3f2fb1370d18ca14066f9cb558552045f707dc91"} Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.629651 4837 scope.go:117] "RemoveContainer" containerID="0057d3b44c8439c0e3378f8419bb78076cafbe98689de885c4f692c37447060c" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.629586 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d887777b5-jjg7k" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.633471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4" event={"ID":"e409442a-5f44-4cb2-a615-be75544036bd","Type":"ContainerDied","Data":"9e7d7cebe2ba72e06107d49a293e481be73b3ef516a9f773351afd0e12a288c0"} Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.633549 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.641325 4837 generic.go:334] "Generic (PLEG): container finished" podID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerID="9a28e04adf3e0396d207e7a32a1dafbe70b8bfaf68d43934c84b492e1875a5a8" exitCode=0 Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.641580 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jk4fx" event={"ID":"1aecbb0e-6cc3-4308-a741-c1799ec8b541","Type":"ContainerDied","Data":"9a28e04adf3e0396d207e7a32a1dafbe70b8bfaf68d43934c84b492e1875a5a8"} Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.662593 4837 scope.go:117] "RemoveContainer" containerID="3b11c074ec2c6bf85ba79d1a90dacbe8c7dcbb54d0058aef3f715d9de43be2c5" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.667888 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.679017 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d64b775c9-dr5p4"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.683637 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-5d887777b5-jjg7k"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.688257 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d887777b5-jjg7k"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.837407 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"] Jan 11 17:35:54 crc kubenswrapper[4837]: E0111 17:35:54.837686 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e409442a-5f44-4cb2-a615-be75544036bd" containerName="route-controller-manager" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.837701 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e409442a-5f44-4cb2-a615-be75544036bd" containerName="route-controller-manager" Jan 11 17:35:54 crc kubenswrapper[4837]: E0111 17:35:54.837713 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d572b6-d310-4a29-af60-e1a39c5a8859" containerName="controller-manager" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.837719 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d572b6-d310-4a29-af60-e1a39c5a8859" containerName="controller-manager" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.837818 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d572b6-d310-4a29-af60-e1a39c5a8859" containerName="controller-manager" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.837831 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e409442a-5f44-4cb2-a615-be75544036bd" containerName="route-controller-manager" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.838273 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.844220 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.844950 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.845746 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.845840 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.846193 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.846336 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.846434 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.846554 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.851347 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.852402 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.852520 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.852633 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.852852 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.852921 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.853057 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.853168 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.855689 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"] Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931439 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-serving-cert\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-client-ca\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931519 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-config\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931541 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-serving-cert\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931562 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h285z\" (UniqueName: \"kubernetes.io/projected/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-kube-api-access-h285z\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931581 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-config\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " 
pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931602 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l69hg\" (UniqueName: \"kubernetes.io/projected/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-kube-api-access-l69hg\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931652 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-proxy-ca-bundles\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.931731 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-client-ca\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:54 crc kubenswrapper[4837]: I0111 17:35:54.944086 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033104 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-catalog-content\") pod \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033235 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-utilities\") pod \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033301 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7r2n\" (UniqueName: \"kubernetes.io/projected/1aecbb0e-6cc3-4308-a741-c1799ec8b541-kube-api-access-p7r2n\") pod \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\" (UID: \"1aecbb0e-6cc3-4308-a741-c1799ec8b541\") " Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033458 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-serving-cert\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033485 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-client-ca\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " 
pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033505 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-config\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-serving-cert\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033558 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h285z\" (UniqueName: \"kubernetes.io/projected/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-kube-api-access-h285z\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033575 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-config\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l69hg\" (UniqueName: 
\"kubernetes.io/projected/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-kube-api-access-l69hg\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033624 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-proxy-ca-bundles\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.033646 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-client-ca\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.034397 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-utilities" (OuterVolumeSpecName: "utilities") pod "1aecbb0e-6cc3-4308-a741-c1799ec8b541" (UID: "1aecbb0e-6cc3-4308-a741-c1799ec8b541"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.034625 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-client-ca\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.034801 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-client-ca\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.035052 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-proxy-ca-bundles\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.035151 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-config\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.035252 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-config\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: 
\"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.038959 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-serving-cert\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.039521 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-serving-cert\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.051908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aecbb0e-6cc3-4308-a741-c1799ec8b541-kube-api-access-p7r2n" (OuterVolumeSpecName: "kube-api-access-p7r2n") pod "1aecbb0e-6cc3-4308-a741-c1799ec8b541" (UID: "1aecbb0e-6cc3-4308-a741-c1799ec8b541"). InnerVolumeSpecName "kube-api-access-p7r2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.061867 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h285z\" (UniqueName: \"kubernetes.io/projected/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-kube-api-access-h285z\") pod \"route-controller-manager-697c9bc45b-xc67k\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") " pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.062994 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l69hg\" (UniqueName: \"kubernetes.io/projected/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-kube-api-access-l69hg\") pod \"controller-manager-78cdf68d5d-rp67p\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") " pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.082548 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1aecbb0e-6cc3-4308-a741-c1799ec8b541" (UID: "1aecbb0e-6cc3-4308-a741-c1799ec8b541"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.135095 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7r2n\" (UniqueName: \"kubernetes.io/projected/1aecbb0e-6cc3-4308-a741-c1799ec8b541-kube-api-access-p7r2n\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.135143 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.135158 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aecbb0e-6cc3-4308-a741-c1799ec8b541-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.159893 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.169704 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.465190 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"] Jan 11 17:35:55 crc kubenswrapper[4837]: W0111 17:35:55.472842 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cb608ee_09ed_44ec_a8b7_d22aa14fe80d.slice/crio-84fcd41d92dd170aa4af66180a73e09731dc8a9347087489ad4209b04c98fc4e WatchSource:0}: Error finding container 84fcd41d92dd170aa4af66180a73e09731dc8a9347087489ad4209b04c98fc4e: Status 404 returned error can't find the container with id 84fcd41d92dd170aa4af66180a73e09731dc8a9347087489ad4209b04c98fc4e Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.503344 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p7kkt"] Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.503621 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p7kkt" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="registry-server" containerID="cri-o://de99e4d2f93d0a5bbca6d85a22db32b108f9a98ea80b39fcee90267e5d310462" gracePeriod=2 Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.617583 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"] Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.649628 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jk4fx" event={"ID":"1aecbb0e-6cc3-4308-a741-c1799ec8b541","Type":"ContainerDied","Data":"c70c75a47202667ce92075c9c7176bd592470daa0f9f64d3910db538c8ab857e"} Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.649699 4837 scope.go:117] 
"RemoveContainer" containerID="9a28e04adf3e0396d207e7a32a1dafbe70b8bfaf68d43934c84b492e1875a5a8" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.649714 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jk4fx" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.656255 4837 generic.go:334] "Generic (PLEG): container finished" podID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerID="de99e4d2f93d0a5bbca6d85a22db32b108f9a98ea80b39fcee90267e5d310462" exitCode=0 Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.656350 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7kkt" event={"ID":"df3f0f97-5906-44b5-99d5-6003e1b23be1","Type":"ContainerDied","Data":"de99e4d2f93d0a5bbca6d85a22db32b108f9a98ea80b39fcee90267e5d310462"} Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.658487 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" event={"ID":"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d","Type":"ContainerStarted","Data":"84fcd41d92dd170aa4af66180a73e09731dc8a9347087489ad4209b04c98fc4e"} Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.660395 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" event={"ID":"d940561a-5e7f-455d-bb6e-30fa73b1c4b7","Type":"ContainerStarted","Data":"d6aa26495a594d2887e26d4f1ad3257348ec62f509f38d1d3b877c7480db4056"} Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.665598 4837 scope.go:117] "RemoveContainer" containerID="360e607e10374d567a08d1492554aba5251c8aaf1f5261f894082ce4af7e526b" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.685535 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jk4fx"] Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.686698 4837 
scope.go:117] "RemoveContainer" containerID="1a9a71953ddde761cb532dd9ef63d52ebae6bfcd0fa0ba084bc27f74267b34df" Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.688598 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jk4fx"] Jan 11 17:35:55 crc kubenswrapper[4837]: I0111 17:35:55.951176 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.048059 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-catalog-content\") pod \"df3f0f97-5906-44b5-99d5-6003e1b23be1\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.048122 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-utilities\") pod \"df3f0f97-5906-44b5-99d5-6003e1b23be1\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.048184 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcwxh\" (UniqueName: \"kubernetes.io/projected/df3f0f97-5906-44b5-99d5-6003e1b23be1-kube-api-access-rcwxh\") pod \"df3f0f97-5906-44b5-99d5-6003e1b23be1\" (UID: \"df3f0f97-5906-44b5-99d5-6003e1b23be1\") " Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.049185 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-utilities" (OuterVolumeSpecName: "utilities") pod "df3f0f97-5906-44b5-99d5-6003e1b23be1" (UID: "df3f0f97-5906-44b5-99d5-6003e1b23be1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.052900 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3f0f97-5906-44b5-99d5-6003e1b23be1-kube-api-access-rcwxh" (OuterVolumeSpecName: "kube-api-access-rcwxh") pod "df3f0f97-5906-44b5-99d5-6003e1b23be1" (UID: "df3f0f97-5906-44b5-99d5-6003e1b23be1"). InnerVolumeSpecName "kube-api-access-rcwxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.101319 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df3f0f97-5906-44b5-99d5-6003e1b23be1" (UID: "df3f0f97-5906-44b5-99d5-6003e1b23be1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.149389 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcwxh\" (UniqueName: \"kubernetes.io/projected/df3f0f97-5906-44b5-99d5-6003e1b23be1-kube-api-access-rcwxh\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.149444 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.149457 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df3f0f97-5906-44b5-99d5-6003e1b23be1-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.369739 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" 
path="/var/lib/kubelet/pods/1aecbb0e-6cc3-4308-a741-c1799ec8b541/volumes" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.370716 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d572b6-d310-4a29-af60-e1a39c5a8859" path="/var/lib/kubelet/pods/a9d572b6-d310-4a29-af60-e1a39c5a8859/volumes" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.371253 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e409442a-5f44-4cb2-a615-be75544036bd" path="/var/lib/kubelet/pods/e409442a-5f44-4cb2-a615-be75544036bd/volumes" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.668160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p7kkt" event={"ID":"df3f0f97-5906-44b5-99d5-6003e1b23be1","Type":"ContainerDied","Data":"797c061fb7e056e4028df3e86c22811a02194612125e61d890da7fc65f0a6eb6"} Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.668183 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p7kkt" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.668208 4837 scope.go:117] "RemoveContainer" containerID="de99e4d2f93d0a5bbca6d85a22db32b108f9a98ea80b39fcee90267e5d310462" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.669495 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" event={"ID":"d940561a-5e7f-455d-bb6e-30fa73b1c4b7","Type":"ContainerStarted","Data":"2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82"} Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.669695 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.672025 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" event={"ID":"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d","Type":"ContainerStarted","Data":"1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa"} Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.672217 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.678466 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.678693 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.688186 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p7kkt"] Jan 11 17:35:56 crc 
kubenswrapper[4837]: I0111 17:35:56.692586 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p7kkt"] Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.700759 4837 scope.go:117] "RemoveContainer" containerID="8fb1b9dddaaa758c26194ce8b895f2c3edf56bee4706366bec94305c47cb43c3" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.702400 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" podStartSLOduration=3.702375493 podStartE2EDuration="3.702375493s" podCreationTimestamp="2026-01-11 17:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:35:56.700893902 +0000 UTC m=+330.879086628" watchObservedRunningTime="2026-01-11 17:35:56.702375493 +0000 UTC m=+330.880568229" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.724659 4837 scope.go:117] "RemoveContainer" containerID="5823a4a921468ed5d06654a32fe52cdbe4ec67f93bb04e760f043ead4a7d2ecc" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.737770 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" podStartSLOduration=3.737747659 podStartE2EDuration="3.737747659s" podCreationTimestamp="2026-01-11 17:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:35:56.719271274 +0000 UTC m=+330.897463980" watchObservedRunningTime="2026-01-11 17:35:56.737747659 +0000 UTC m=+330.915940365" Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.905580 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dht47"] Jan 11 17:35:56 crc kubenswrapper[4837]: I0111 17:35:56.906133 4837 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-marketplace-dht47" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="registry-server" containerID="cri-o://ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58" gracePeriod=2 Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.354811 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.468220 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-utilities\") pod \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.468554 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cmwk\" (UniqueName: \"kubernetes.io/projected/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-kube-api-access-9cmwk\") pod \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.468608 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-catalog-content\") pod \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\" (UID: \"2bf0823a-63d9-43c9-9cc7-1e2f364b7855\") " Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.469032 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-utilities" (OuterVolumeSpecName: "utilities") pod "2bf0823a-63d9-43c9-9cc7-1e2f364b7855" (UID: "2bf0823a-63d9-43c9-9cc7-1e2f364b7855"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.474422 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-kube-api-access-9cmwk" (OuterVolumeSpecName: "kube-api-access-9cmwk") pod "2bf0823a-63d9-43c9-9cc7-1e2f364b7855" (UID: "2bf0823a-63d9-43c9-9cc7-1e2f364b7855"). InnerVolumeSpecName "kube-api-access-9cmwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.495802 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bf0823a-63d9-43c9-9cc7-1e2f364b7855" (UID: "2bf0823a-63d9-43c9-9cc7-1e2f364b7855"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.570708 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.570750 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cmwk\" (UniqueName: \"kubernetes.io/projected/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-kube-api-access-9cmwk\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.570763 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bf0823a-63d9-43c9-9cc7-1e2f364b7855-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.678552 4837 generic.go:334] "Generic (PLEG): container finished" podID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" 
containerID="ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58" exitCode=0 Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.678633 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dht47" event={"ID":"2bf0823a-63d9-43c9-9cc7-1e2f364b7855","Type":"ContainerDied","Data":"ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58"} Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.678660 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dht47" event={"ID":"2bf0823a-63d9-43c9-9cc7-1e2f364b7855","Type":"ContainerDied","Data":"8b2ffa20981fa479ca0703e74ac8b709cb484f2c0f08727794301b33699c2c82"} Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.678656 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dht47" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.678702 4837 scope.go:117] "RemoveContainer" containerID="ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.694326 4837 scope.go:117] "RemoveContainer" containerID="97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.710346 4837 scope.go:117] "RemoveContainer" containerID="8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.748812 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dht47"] Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.751531 4837 scope.go:117] "RemoveContainer" containerID="ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58" Jan 11 17:35:57 crc kubenswrapper[4837]: E0111 17:35:57.752132 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58\": container with ID starting with ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58 not found: ID does not exist" containerID="ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.752180 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58"} err="failed to get container status \"ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58\": rpc error: code = NotFound desc = could not find container \"ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58\": container with ID starting with ed49fc1156a8b36ddc04d26696827f282b22c8de5b9d2f8004050cc6a4e0ef58 not found: ID does not exist" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.752206 4837 scope.go:117] "RemoveContainer" containerID="97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c" Jan 11 17:35:57 crc kubenswrapper[4837]: E0111 17:35:57.752558 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c\": container with ID starting with 97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c not found: ID does not exist" containerID="97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.752578 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c"} err="failed to get container status \"97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c\": rpc error: code = NotFound desc = could not find container \"97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c\": container 
with ID starting with 97dbb9dfdb354c6380f7d51f299799dc6abc105cd069a414692210966b79ab6c not found: ID does not exist" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.752591 4837 scope.go:117] "RemoveContainer" containerID="8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b" Jan 11 17:35:57 crc kubenswrapper[4837]: E0111 17:35:57.753249 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b\": container with ID starting with 8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b not found: ID does not exist" containerID="8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.753273 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b"} err="failed to get container status \"8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b\": rpc error: code = NotFound desc = could not find container \"8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b\": container with ID starting with 8ec512d8d499f7cff74278971351339d0e1358e1d54da8bda6dc623aaf984d7b not found: ID does not exist" Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.755950 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dht47"] Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.901581 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvd2s"] Jan 11 17:35:57 crc kubenswrapper[4837]: I0111 17:35:57.901812 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pvd2s" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="registry-server" 
containerID="cri-o://b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4" gracePeriod=2 Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.294400 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvd2s" Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.369906 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" path="/var/lib/kubelet/pods/2bf0823a-63d9-43c9-9cc7-1e2f364b7855/volumes" Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.370828 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" path="/var/lib/kubelet/pods/df3f0f97-5906-44b5-99d5-6003e1b23be1/volumes" Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.380075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-utilities\") pod \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.380129 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjfqz\" (UniqueName: \"kubernetes.io/projected/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-kube-api-access-hjfqz\") pod \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.380167 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-catalog-content\") pod \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\" (UID: \"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c\") " Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.381243 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-utilities" (OuterVolumeSpecName: "utilities") pod "0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" (UID: "0f2f437a-d901-4e9b-95c1-9099d8ffaf9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.385091 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-kube-api-access-hjfqz" (OuterVolumeSpecName: "kube-api-access-hjfqz") pod "0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" (UID: "0f2f437a-d901-4e9b-95c1-9099d8ffaf9c"). InnerVolumeSpecName "kube-api-access-hjfqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.482134 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-utilities\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.482203 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjfqz\" (UniqueName: \"kubernetes.io/projected/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-kube-api-access-hjfqz\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.495379 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" (UID: "0f2f437a-d901-4e9b-95c1-9099d8ffaf9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.583819 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.690163 4837 generic.go:334] "Generic (PLEG): container finished" podID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerID="b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4" exitCode=0
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.690904 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvd2s"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.692705 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvd2s" event={"ID":"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c","Type":"ContainerDied","Data":"b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4"}
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.692855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvd2s" event={"ID":"0f2f437a-d901-4e9b-95c1-9099d8ffaf9c","Type":"ContainerDied","Data":"d65af25391a3909726d5c7948186895e49ad733f8515b03dda341949131b0b79"}
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.692903 4837 scope.go:117] "RemoveContainer" containerID="b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.707045 4837 scope.go:117] "RemoveContainer" containerID="2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.725279 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvd2s"]
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.727907 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pvd2s"]
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.743169 4837 scope.go:117] "RemoveContainer" containerID="e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.762736 4837 scope.go:117] "RemoveContainer" containerID="b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4"
Jan 11 17:35:58 crc kubenswrapper[4837]: E0111 17:35:58.763357 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4\": container with ID starting with b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4 not found: ID does not exist" containerID="b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.763784 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4"} err="failed to get container status \"b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4\": rpc error: code = NotFound desc = could not find container \"b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4\": container with ID starting with b8deaa61f47f6740e8ce536e0d2ee2622e91727f7b1f178ca10613b63e74ebd4 not found: ID does not exist"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.763830 4837 scope.go:117] "RemoveContainer" containerID="2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9"
Jan 11 17:35:58 crc kubenswrapper[4837]: E0111 17:35:58.764316 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9\": container with ID starting with 2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9 not found: ID does not exist" containerID="2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.764437 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9"} err="failed to get container status \"2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9\": rpc error: code = NotFound desc = could not find container \"2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9\": container with ID starting with 2c9b2d61359d4e71a2879e5d1978c4c4518ed5190b05203736f532079767d5b9 not found: ID does not exist"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.764494 4837 scope.go:117] "RemoveContainer" containerID="e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3"
Jan 11 17:35:58 crc kubenswrapper[4837]: E0111 17:35:58.765097 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3\": container with ID starting with e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3 not found: ID does not exist" containerID="e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3"
Jan 11 17:35:58 crc kubenswrapper[4837]: I0111 17:35:58.765128 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3"} err="failed to get container status \"e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3\": rpc error: code = NotFound desc = could not find container \"e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3\": container with ID starting with e1109df7548fbbe03c9812204667c1a90ae3db7ecebd3640143da0f6bb033fb3 not found: ID does not exist"
Jan 11 17:36:00 crc kubenswrapper[4837]: I0111 17:36:00.384054 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" path="/var/lib/kubelet/pods/0f2f437a-d901-4e9b-95c1-9099d8ffaf9c/volumes"
Jan 11 17:36:09 crc kubenswrapper[4837]: I0111 17:36:09.080028 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ldzgv"]
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.149342 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"]
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.150509 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" podUID="d940561a-5e7f-455d-bb6e-30fa73b1c4b7" containerName="controller-manager" containerID="cri-o://2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82" gracePeriod=30
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.165054 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"]
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.165397 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" podUID="6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" containerName="route-controller-manager" containerID="cri-o://1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa" gracePeriod=30
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.746985 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.751380 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.809158 4837 generic.go:334] "Generic (PLEG): container finished" podID="6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" containerID="1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa" exitCode=0
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.809213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" event={"ID":"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d","Type":"ContainerDied","Data":"1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa"}
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.809238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k" event={"ID":"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d","Type":"ContainerDied","Data":"84fcd41d92dd170aa4af66180a73e09731dc8a9347087489ad4209b04c98fc4e"}
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.809256 4837 scope.go:117] "RemoveContainer" containerID="1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.809340 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.811814 4837 generic.go:334] "Generic (PLEG): container finished" podID="d940561a-5e7f-455d-bb6e-30fa73b1c4b7" containerID="2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82" exitCode=0
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.811848 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" event={"ID":"d940561a-5e7f-455d-bb6e-30fa73b1c4b7","Type":"ContainerDied","Data":"2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82"}
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.811868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p" event={"ID":"d940561a-5e7f-455d-bb6e-30fa73b1c4b7","Type":"ContainerDied","Data":"d6aa26495a594d2887e26d4f1ad3257348ec62f509f38d1d3b877c7480db4056"}
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.811912 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813228 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-serving-cert\") pod \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813286 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-config\") pod \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813311 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h285z\" (UniqueName: \"kubernetes.io/projected/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-kube-api-access-h285z\") pod \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813336 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-proxy-ca-bundles\") pod \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-serving-cert\") pod \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813419 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l69hg\" (UniqueName: \"kubernetes.io/projected/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-kube-api-access-l69hg\") pod \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813457 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-client-ca\") pod \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813479 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-client-ca\") pod \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\" (UID: \"6cb608ee-09ed-44ec-a8b7-d22aa14fe80d\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.813500 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-config\") pod \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\" (UID: \"d940561a-5e7f-455d-bb6e-30fa73b1c4b7\") "
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.814300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d940561a-5e7f-455d-bb6e-30fa73b1c4b7" (UID: "d940561a-5e7f-455d-bb6e-30fa73b1c4b7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.814560 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "d940561a-5e7f-455d-bb6e-30fa73b1c4b7" (UID: "d940561a-5e7f-455d-bb6e-30fa73b1c4b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.814365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-config" (OuterVolumeSpecName: "config") pod "d940561a-5e7f-455d-bb6e-30fa73b1c4b7" (UID: "d940561a-5e7f-455d-bb6e-30fa73b1c4b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.814981 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-client-ca" (OuterVolumeSpecName: "client-ca") pod "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" (UID: "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.815328 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-config" (OuterVolumeSpecName: "config") pod "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" (UID: "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.819035 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-kube-api-access-l69hg" (OuterVolumeSpecName: "kube-api-access-l69hg") pod "d940561a-5e7f-455d-bb6e-30fa73b1c4b7" (UID: "d940561a-5e7f-455d-bb6e-30fa73b1c4b7"). InnerVolumeSpecName "kube-api-access-l69hg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.819258 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" (UID: "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.831296 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d940561a-5e7f-455d-bb6e-30fa73b1c4b7" (UID: "d940561a-5e7f-455d-bb6e-30fa73b1c4b7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.831307 4837 scope.go:117] "RemoveContainer" containerID="1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.831648 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-kube-api-access-h285z" (OuterVolumeSpecName: "kube-api-access-h285z") pod "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" (UID: "6cb608ee-09ed-44ec-a8b7-d22aa14fe80d"). InnerVolumeSpecName "kube-api-access-h285z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:36:13 crc kubenswrapper[4837]: E0111 17:36:13.831912 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa\": container with ID starting with 1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa not found: ID does not exist" containerID="1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.831952 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa"} err="failed to get container status \"1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa\": rpc error: code = NotFound desc = could not find container \"1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa\": container with ID starting with 1d13a91e4e0878bdf0fc145511e4abb890c2c9134ff494c0a8020641853098fa not found: ID does not exist"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.831980 4837 scope.go:117] "RemoveContainer" containerID="2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.851637 4837 scope.go:117] "RemoveContainer" containerID="2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82"
Jan 11 17:36:13 crc kubenswrapper[4837]: E0111 17:36:13.852100 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82\": container with ID starting with 2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82 not found: ID does not exist" containerID="2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.852146 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82"} err="failed to get container status \"2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82\": rpc error: code = NotFound desc = could not find container \"2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82\": container with ID starting with 2668822f06013c245a6f6b7d1f3b753d5e9d71e07f839edc3a48ba75009b6c82 not found: ID does not exist"
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914409 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914466 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l69hg\" (UniqueName: \"kubernetes.io/projected/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-kube-api-access-l69hg\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914480 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-client-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914490 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-client-ca\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914501 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914510 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914520 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-config\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914529 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h285z\" (UniqueName: \"kubernetes.io/projected/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d-kube-api-access-h285z\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:13 crc kubenswrapper[4837]: I0111 17:36:13.914538 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d940561a-5e7f-455d-bb6e-30fa73b1c4b7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.147244 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"]
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.156731 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-697c9bc45b-xc67k"]
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.161342 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"]
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.165270 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78cdf68d5d-rp67p"]
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.375535 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" path="/var/lib/kubelet/pods/6cb608ee-09ed-44ec-a8b7-d22aa14fe80d/volumes"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.377044 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d940561a-5e7f-455d-bb6e-30fa73b1c4b7" path="/var/lib/kubelet/pods/d940561a-5e7f-455d-bb6e-30fa73b1c4b7/volumes"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.841592 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"]
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.841906 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.841931 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.841946 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" containerName="route-controller-manager"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.841956 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" containerName="route-controller-manager"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.841966 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.841974 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.841986 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.841995 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842007 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842015 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842026 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842033 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842047 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842057 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842067 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842075 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842088 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842097 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842108 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842116 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842125 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d940561a-5e7f-455d-bb6e-30fa73b1c4b7" containerName="controller-manager"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842133 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d940561a-5e7f-455d-bb6e-30fa73b1c4b7" containerName="controller-manager"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842143 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842151 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842167 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842175 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="extract-content"
Jan 11 17:36:14 crc kubenswrapper[4837]: E0111 17:36:14.842189 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842197 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="extract-utilities"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842306 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf0823a-63d9-43c9-9cc7-1e2f364b7855" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842323 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3f0f97-5906-44b5-99d5-6003e1b23be1" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842342 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d940561a-5e7f-455d-bb6e-30fa73b1c4b7" containerName="controller-manager"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842353 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb608ee-09ed-44ec-a8b7-d22aa14fe80d" containerName="route-controller-manager"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842362 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aecbb0e-6cc3-4308-a741-c1799ec8b541" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842374 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2f437a-d901-4e9b-95c1-9099d8ffaf9c" containerName="registry-server"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.842853 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.844852 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.844904 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.845513 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.845522 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8565796d8b-lmzjz"]
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.845553 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.846305 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.848569 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.848792 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.848862 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.849009 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.849107 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.850208 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.850950 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.853284 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.858942 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.860683 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8565796d8b-lmzjz"]
Jan 11 17:36:14 crc kubenswrapper[4837]: I0111 17:36:14.864332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"]
Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029007 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-config\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz"
Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029055 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2cgd\" (UniqueName: \"kubernetes.io/projected/188c58de-fa6a-400f-a626-bc8ea9b63976-kube-api-access-x2cgd\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz"
Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029090 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-serving-cert\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"
Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029159 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-proxy-ca-bundles\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz"
Jan 11 17:36:15 crc kubenswrapper[4837]:
I0111 17:36:15.029192 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-client-ca\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-client-ca\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188c58de-fa6a-400f-a626-bc8ea9b63976-serving-cert\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029433 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-config\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.029492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r842p\" (UniqueName: 
\"kubernetes.io/projected/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-kube-api-access-r842p\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.130757 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-proxy-ca-bundles\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.130848 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-client-ca\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.130904 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-client-ca\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.130929 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188c58de-fa6a-400f-a626-bc8ea9b63976-serving-cert\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: 
I0111 17:36:15.130945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-config\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.130982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r842p\" (UniqueName: \"kubernetes.io/projected/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-kube-api-access-r842p\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.130999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-config\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.131024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2cgd\" (UniqueName: \"kubernetes.io/projected/188c58de-fa6a-400f-a626-bc8ea9b63976-kube-api-access-x2cgd\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.131064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-serving-cert\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: 
\"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.132057 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-proxy-ca-bundles\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.132076 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-client-ca\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.133218 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-config\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.133381 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-config\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.134188 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-client-ca\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.136626 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188c58de-fa6a-400f-a626-bc8ea9b63976-serving-cert\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.147381 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-serving-cert\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.152124 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2cgd\" (UniqueName: \"kubernetes.io/projected/188c58de-fa6a-400f-a626-bc8ea9b63976-kube-api-access-x2cgd\") pod \"controller-manager-8565796d8b-lmzjz\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.153475 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r842p\" (UniqueName: \"kubernetes.io/projected/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-kube-api-access-r842p\") pod \"route-controller-manager-56669bb699-p9rjg\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 
17:36:15.174185 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.199478 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.634112 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"] Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.693048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8565796d8b-lmzjz"] Jan 11 17:36:15 crc kubenswrapper[4837]: W0111 17:36:15.699403 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188c58de_fa6a_400f_a626_bc8ea9b63976.slice/crio-80601551357d49cd1bcf2a2481be75004436e8a67e92674caf336280020a2caa WatchSource:0}: Error finding container 80601551357d49cd1bcf2a2481be75004436e8a67e92674caf336280020a2caa: Status 404 returned error can't find the container with id 80601551357d49cd1bcf2a2481be75004436e8a67e92674caf336280020a2caa Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.825641 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" event={"ID":"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d","Type":"ContainerStarted","Data":"62aa55c8ca5f3ddf05bfa5b6121d51aeb7868ef336d4277a9b9b149a9d7024e4"} Jan 11 17:36:15 crc kubenswrapper[4837]: I0111 17:36:15.826846 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" 
event={"ID":"188c58de-fa6a-400f-a626-bc8ea9b63976","Type":"ContainerStarted","Data":"80601551357d49cd1bcf2a2481be75004436e8a67e92674caf336280020a2caa"} Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.832874 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" event={"ID":"188c58de-fa6a-400f-a626-bc8ea9b63976","Type":"ContainerStarted","Data":"217be6929e96988578f50f4b77602096c983070eaf813bbc08578f1e8b990f1d"} Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.833385 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.834195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" event={"ID":"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d","Type":"ContainerStarted","Data":"525a145583fadf5bd47677e7b31a508d33b51cabe2077da1c9ee9f33deca3eed"} Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.834625 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.838205 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.839997 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.849085 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" podStartSLOduration=3.849065006 podStartE2EDuration="3.849065006s" podCreationTimestamp="2026-01-11 
17:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:36:16.846275539 +0000 UTC m=+351.024468265" watchObservedRunningTime="2026-01-11 17:36:16.849065006 +0000 UTC m=+351.027257702" Jan 11 17:36:16 crc kubenswrapper[4837]: I0111 17:36:16.891930 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" podStartSLOduration=3.89191205 podStartE2EDuration="3.89191205s" podCreationTimestamp="2026-01-11 17:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:36:16.889186964 +0000 UTC m=+351.067379670" watchObservedRunningTime="2026-01-11 17:36:16.89191205 +0000 UTC m=+351.070104766" Jan 11 17:36:33 crc kubenswrapper[4837]: I0111 17:36:33.054314 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8565796d8b-lmzjz"] Jan 11 17:36:33 crc kubenswrapper[4837]: I0111 17:36:33.055531 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" podUID="188c58de-fa6a-400f-a626-bc8ea9b63976" containerName="controller-manager" containerID="cri-o://217be6929e96988578f50f4b77602096c983070eaf813bbc08578f1e8b990f1d" gracePeriod=30 Jan 11 17:36:33 crc kubenswrapper[4837]: I0111 17:36:33.189521 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"] Jan 11 17:36:33 crc kubenswrapper[4837]: I0111 17:36:33.189728 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" podUID="4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" containerName="route-controller-manager" 
containerID="cri-o://525a145583fadf5bd47677e7b31a508d33b51cabe2077da1c9ee9f33deca3eed" gracePeriod=30 Jan 11 17:36:34 crc kubenswrapper[4837]: I0111 17:36:34.129100 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" podUID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" containerName="oauth-openshift" containerID="cri-o://e85f3b8dd2e3e1c3b105111e316094cea33f35099a17bdeeaec768ca613f28ce" gracePeriod=15 Jan 11 17:36:34 crc kubenswrapper[4837]: I0111 17:36:34.940096 4837 generic.go:334] "Generic (PLEG): container finished" podID="4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" containerID="525a145583fadf5bd47677e7b31a508d33b51cabe2077da1c9ee9f33deca3eed" exitCode=0 Jan 11 17:36:34 crc kubenswrapper[4837]: I0111 17:36:34.940282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" event={"ID":"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d","Type":"ContainerDied","Data":"525a145583fadf5bd47677e7b31a508d33b51cabe2077da1c9ee9f33deca3eed"} Jan 11 17:36:34 crc kubenswrapper[4837]: I0111 17:36:34.943605 4837 generic.go:334] "Generic (PLEG): container finished" podID="188c58de-fa6a-400f-a626-bc8ea9b63976" containerID="217be6929e96988578f50f4b77602096c983070eaf813bbc08578f1e8b990f1d" exitCode=0 Jan 11 17:36:34 crc kubenswrapper[4837]: I0111 17:36:34.943664 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" event={"ID":"188c58de-fa6a-400f-a626-bc8ea9b63976","Type":"ContainerDied","Data":"217be6929e96988578f50f4b77602096c983070eaf813bbc08578f1e8b990f1d"} Jan 11 17:36:34 crc kubenswrapper[4837]: I0111 17:36:34.946195 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" containerID="e85f3b8dd2e3e1c3b105111e316094cea33f35099a17bdeeaec768ca613f28ce" exitCode=0 Jan 11 17:36:34 crc kubenswrapper[4837]: I0111 
17:36:34.946262 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" event={"ID":"1b5ce6bf-72e2-494a-aa22-830e992fbec5","Type":"ContainerDied","Data":"e85f3b8dd2e3e1c3b105111e316094cea33f35099a17bdeeaec768ca613f28ce"} Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.200440 4837 patch_prober.go:28] interesting pod/controller-manager-8565796d8b-lmzjz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.200502 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" podUID="188c58de-fa6a-400f-a626-bc8ea9b63976" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.471907 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.506351 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5"] Jan 11 17:36:35 crc kubenswrapper[4837]: E0111 17:36:35.506649 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" containerName="route-controller-manager" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.506668 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" containerName="route-controller-manager" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.506815 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" containerName="route-controller-manager" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.507244 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.517821 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5"] Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.540548 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-config\") pod \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.540605 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-serving-cert\") pod \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.540633 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-client-ca\") pod \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.540787 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r842p\" (UniqueName: \"kubernetes.io/projected/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-kube-api-access-r842p\") pod \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\" (UID: \"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.540952 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktdb7\" (UniqueName: \"kubernetes.io/projected/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-kube-api-access-ktdb7\") 
pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.540981 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-config\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.541036 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-serving-cert\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.541052 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-client-ca\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.541455 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-config" (OuterVolumeSpecName: "config") pod "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" (UID: "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.541498 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" (UID: "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.549053 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" (UID: "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.549823 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-kube-api-access-r842p" (OuterVolumeSpecName: "kube-api-access-r842p") pod "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" (UID: "4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d"). InnerVolumeSpecName "kube-api-access-r842p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.588464 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641639 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-session\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641724 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-serving-cert\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641774 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-trusted-ca-bundle\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-ocp-branding-template\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641841 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-error\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: 
\"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641856 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-dir\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641878 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-idp-0-file-data\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641922 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-router-certs\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641941 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-provider-selection\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.641961 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-service-ca\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: 
I0111 17:36:35.641984 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-cliconfig\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642017 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-policies\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642043 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8mhv\" (UniqueName: \"kubernetes.io/projected/1b5ce6bf-72e2-494a-aa22-830e992fbec5-kube-api-access-x8mhv\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642083 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-login\") pod \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\" (UID: \"1b5ce6bf-72e2-494a-aa22-830e992fbec5\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642257 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktdb7\" (UniqueName: \"kubernetes.io/projected/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-kube-api-access-ktdb7\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642285 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-config\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-serving-cert\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642344 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-client-ca\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642382 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642393 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642402 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc 
kubenswrapper[4837]: I0111 17:36:35.642419 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r842p\" (UniqueName: \"kubernetes.io/projected/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d-kube-api-access-r842p\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642605 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.642699 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.643383 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-client-ca\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.645883 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.646579 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-config\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.647117 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.647187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.647510 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.647905 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.648174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5ce6bf-72e2-494a-aa22-830e992fbec5-kube-api-access-x8mhv" (OuterVolumeSpecName: "kube-api-access-x8mhv") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "kube-api-access-x8mhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.648600 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.648829 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.648907 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.649990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-serving-cert\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.658735 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.659326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.661290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1b5ce6bf-72e2-494a-aa22-830e992fbec5" (UID: "1b5ce6bf-72e2-494a-aa22-830e992fbec5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.662742 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktdb7\" (UniqueName: \"kubernetes.io/projected/b70f8a05-9d11-4cf1-ad6f-9ba72b22466a-kube-api-access-ktdb7\") pod \"route-controller-manager-54b89cbf46-54qg5\" (UID: \"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a\") " pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.668330 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.742805 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2cgd\" (UniqueName: \"kubernetes.io/projected/188c58de-fa6a-400f-a626-bc8ea9b63976-kube-api-access-x2cgd\") pod \"188c58de-fa6a-400f-a626-bc8ea9b63976\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.742842 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188c58de-fa6a-400f-a626-bc8ea9b63976-serving-cert\") pod \"188c58de-fa6a-400f-a626-bc8ea9b63976\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.742886 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-proxy-ca-bundles\") pod \"188c58de-fa6a-400f-a626-bc8ea9b63976\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.742911 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-config\") pod \"188c58de-fa6a-400f-a626-bc8ea9b63976\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.742932 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-client-ca\") pod \"188c58de-fa6a-400f-a626-bc8ea9b63976\" (UID: \"188c58de-fa6a-400f-a626-bc8ea9b63976\") " Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743054 4837 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743065 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743075 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8mhv\" (UniqueName: \"kubernetes.io/projected/1b5ce6bf-72e2-494a-aa22-830e992fbec5-kube-api-access-x8mhv\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743084 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743093 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743101 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743110 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743121 4837 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743130 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743138 4837 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce6bf-72e2-494a-aa22-830e992fbec5-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743148 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743157 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743166 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743175 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b5ce6bf-72e2-494a-aa22-830e992fbec5-v4-0-config-system-service-ca\") on 
node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.743769 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "188c58de-fa6a-400f-a626-bc8ea9b63976" (UID: "188c58de-fa6a-400f-a626-bc8ea9b63976"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.744083 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-client-ca" (OuterVolumeSpecName: "client-ca") pod "188c58de-fa6a-400f-a626-bc8ea9b63976" (UID: "188c58de-fa6a-400f-a626-bc8ea9b63976"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.744305 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-config" (OuterVolumeSpecName: "config") pod "188c58de-fa6a-400f-a626-bc8ea9b63976" (UID: "188c58de-fa6a-400f-a626-bc8ea9b63976"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.746582 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188c58de-fa6a-400f-a626-bc8ea9b63976-kube-api-access-x2cgd" (OuterVolumeSpecName: "kube-api-access-x2cgd") pod "188c58de-fa6a-400f-a626-bc8ea9b63976" (UID: "188c58de-fa6a-400f-a626-bc8ea9b63976"). InnerVolumeSpecName "kube-api-access-x2cgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.747365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/188c58de-fa6a-400f-a626-bc8ea9b63976-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "188c58de-fa6a-400f-a626-bc8ea9b63976" (UID: "188c58de-fa6a-400f-a626-bc8ea9b63976"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.826240 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.844583 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.844616 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.844630 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/188c58de-fa6a-400f-a626-bc8ea9b63976-client-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.844639 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2cgd\" (UniqueName: \"kubernetes.io/projected/188c58de-fa6a-400f-a626-bc8ea9b63976-kube-api-access-x2cgd\") on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.844648 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188c58de-fa6a-400f-a626-bc8ea9b63976-serving-cert\") 
on node \"crc\" DevicePath \"\"" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.953908 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" event={"ID":"188c58de-fa6a-400f-a626-bc8ea9b63976","Type":"ContainerDied","Data":"80601551357d49cd1bcf2a2481be75004436e8a67e92674caf336280020a2caa"} Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.953949 4837 scope.go:117] "RemoveContainer" containerID="217be6929e96988578f50f4b77602096c983070eaf813bbc08578f1e8b990f1d" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.953921 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8565796d8b-lmzjz" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.957077 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" event={"ID":"4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d","Type":"ContainerDied","Data":"62aa55c8ca5f3ddf05bfa5b6121d51aeb7868ef336d4277a9b9b149a9d7024e4"} Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.957251 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.960594 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" event={"ID":"1b5ce6bf-72e2-494a-aa22-830e992fbec5","Type":"ContainerDied","Data":"a47addc44e48347ea1e4f8889bae172c523bd73bbeeef86fe01a6cf74f67a66b"} Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.960664 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ldzgv" Jan 11 17:36:35 crc kubenswrapper[4837]: I0111 17:36:35.982054 4837 scope.go:117] "RemoveContainer" containerID="525a145583fadf5bd47677e7b31a508d33b51cabe2077da1c9ee9f33deca3eed" Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.000391 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8565796d8b-lmzjz"] Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.004501 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8565796d8b-lmzjz"] Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.010656 4837 scope.go:117] "RemoveContainer" containerID="e85f3b8dd2e3e1c3b105111e316094cea33f35099a17bdeeaec768ca613f28ce" Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.020386 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"] Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.026341 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg"] Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.032852 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ldzgv"] Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.038346 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ldzgv"] Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.175300 4837 patch_prober.go:28] interesting pod/route-controller-manager-56669bb699-p9rjg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" start-of-body= Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.175373 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-56669bb699-p9rjg" podUID="4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.228345 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5"] Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.386406 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188c58de-fa6a-400f-a626-bc8ea9b63976" path="/var/lib/kubelet/pods/188c58de-fa6a-400f-a626-bc8ea9b63976/volumes" Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.388768 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" path="/var/lib/kubelet/pods/1b5ce6bf-72e2-494a-aa22-830e992fbec5/volumes" Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.390446 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d" path="/var/lib/kubelet/pods/4daba4a5-0eb1-43c4-8194-f0f09ccb1c8d/volumes" Jan 11 17:36:36 crc kubenswrapper[4837]: W0111 17:36:36.509162 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70f8a05_9d11_4cf1_ad6f_9ba72b22466a.slice/crio-af777e7aa39e1197d073445a1b34d9fe663b07b6fd8abedd8bc49fad9513e117 WatchSource:0}: Error finding container af777e7aa39e1197d073445a1b34d9fe663b07b6fd8abedd8bc49fad9513e117: Status 404 returned error can't find the container with id af777e7aa39e1197d073445a1b34d9fe663b07b6fd8abedd8bc49fad9513e117 Jan 11 17:36:36 
crc kubenswrapper[4837]: I0111 17:36:36.971362 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" event={"ID":"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a","Type":"ContainerStarted","Data":"998f7a36f336ff040330fb8ed6a3bc393dbbbec3a7b9c1b8037705398f839f58"} Jan 11 17:36:36 crc kubenswrapper[4837]: I0111 17:36:36.972040 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" event={"ID":"b70f8a05-9d11-4cf1-ad6f-9ba72b22466a","Type":"ContainerStarted","Data":"af777e7aa39e1197d073445a1b34d9fe663b07b6fd8abedd8bc49fad9513e117"} Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.874529 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7"] Jan 11 17:36:37 crc kubenswrapper[4837]: E0111 17:36:37.875268 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188c58de-fa6a-400f-a626-bc8ea9b63976" containerName="controller-manager" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.875301 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="188c58de-fa6a-400f-a626-bc8ea9b63976" containerName="controller-manager" Jan 11 17:36:37 crc kubenswrapper[4837]: E0111 17:36:37.875430 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" containerName="oauth-openshift" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.875454 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" containerName="oauth-openshift" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.875707 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5ce6bf-72e2-494a-aa22-830e992fbec5" containerName="oauth-openshift" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.875748 4837 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="188c58de-fa6a-400f-a626-bc8ea9b63976" containerName="controller-manager" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.876842 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.883279 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.883353 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.883870 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.884605 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.885028 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.885471 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.889521 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7"] Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.897721 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.970890 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7566b5f7-27f7-4223-8aab-14d589b8222a-serving-cert\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.971189 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-config\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.971316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-proxy-ca-bundles\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.971484 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-client-ca\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.971577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc28\" (UniqueName: \"kubernetes.io/projected/7566b5f7-27f7-4223-8aab-14d589b8222a-kube-api-access-tpc28\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" 
Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.979729 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.989908 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" Jan 11 17:36:37 crc kubenswrapper[4837]: I0111 17:36:37.997038 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54b89cbf46-54qg5" podStartSLOduration=4.997020203 podStartE2EDuration="4.997020203s" podCreationTimestamp="2026-01-11 17:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:36:37.995105679 +0000 UTC m=+372.173298405" watchObservedRunningTime="2026-01-11 17:36:37.997020203 +0000 UTC m=+372.175212919" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.072364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7566b5f7-27f7-4223-8aab-14d589b8222a-serving-cert\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.072423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-config\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.072448 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-proxy-ca-bundles\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.072511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-client-ca\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.072537 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc28\" (UniqueName: \"kubernetes.io/projected/7566b5f7-27f7-4223-8aab-14d589b8222a-kube-api-access-tpc28\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.074155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-proxy-ca-bundles\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.074155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-client-ca\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: 
I0111 17:36:38.074891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7566b5f7-27f7-4223-8aab-14d589b8222a-config\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.082442 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7566b5f7-27f7-4223-8aab-14d589b8222a-serving-cert\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.088850 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc28\" (UniqueName: \"kubernetes.io/projected/7566b5f7-27f7-4223-8aab-14d589b8222a-kube-api-access-tpc28\") pod \"controller-manager-5d4d698ddf-8fvc7\" (UID: \"7566b5f7-27f7-4223-8aab-14d589b8222a\") " pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.204533 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.620096 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7"] Jan 11 17:36:38 crc kubenswrapper[4837]: W0111 17:36:38.624201 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7566b5f7_27f7_4223_8aab_14d589b8222a.slice/crio-003f63c997379305e8371d095c9b9b8d44c4343a7cc65709b34a7bf3cebc24fb WatchSource:0}: Error finding container 003f63c997379305e8371d095c9b9b8d44c4343a7cc65709b34a7bf3cebc24fb: Status 404 returned error can't find the container with id 003f63c997379305e8371d095c9b9b8d44c4343a7cc65709b34a7bf3cebc24fb Jan 11 17:36:38 crc kubenswrapper[4837]: I0111 17:36:38.988523 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" event={"ID":"7566b5f7-27f7-4223-8aab-14d589b8222a","Type":"ContainerStarted","Data":"003f63c997379305e8371d095c9b9b8d44c4343a7cc65709b34a7bf3cebc24fb"} Jan 11 17:36:39 crc kubenswrapper[4837]: I0111 17:36:39.996658 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" event={"ID":"7566b5f7-27f7-4223-8aab-14d589b8222a","Type":"ContainerStarted","Data":"797acde52c738dd5922cdf47f307e2ef18bb6e4d095f55ff42a22eac0d732c84"} Jan 11 17:36:40 crc kubenswrapper[4837]: I0111 17:36:40.019720 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" podStartSLOduration=7.019704139 podStartE2EDuration="7.019704139s" podCreationTimestamp="2026-01-11 17:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:36:40.017521368 +0000 UTC 
m=+374.195714104" watchObservedRunningTime="2026-01-11 17:36:40.019704139 +0000 UTC m=+374.197896835" Jan 11 17:36:41 crc kubenswrapper[4837]: I0111 17:36:41.005108 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:41 crc kubenswrapper[4837]: I0111 17:36:41.015652 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d4d698ddf-8fvc7" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.876218 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-599f75c894-7s8wd"] Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.879115 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.884034 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.885902 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.885965 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.886196 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.886274 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.886325 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.886562 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.890138 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.896293 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.896462 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.899268 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.899899 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.934068 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-599f75c894-7s8wd"] Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.936415 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.939253 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.950818 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 11 
17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.953825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-error\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.953878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-service-ca\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.953949 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-session\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.953979 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-login\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954058 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954087 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx6j7\" (UniqueName: \"kubernetes.io/projected/fca4c497-c46f-4513-b978-d8abd595578d-kube-api-access-zx6j7\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954126 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954150 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-router-certs\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954168 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954215 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-audit-policies\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: 
\"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:43 crc kubenswrapper[4837]: I0111 17:36:43.954342 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fca4c497-c46f-4513-b978-d8abd595578d-audit-dir\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056110 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-audit-policies\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056215 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fca4c497-c46f-4513-b978-d8abd595578d-audit-dir\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-error\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-service-ca\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056313 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-session\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-login\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056349 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fca4c497-c46f-4513-b978-d8abd595578d-audit-dir\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc 
kubenswrapper[4837]: I0111 17:36:44.056368 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056445 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx6j7\" (UniqueName: \"kubernetes.io/projected/fca4c497-c46f-4513-b978-d8abd595578d-kube-api-access-zx6j7\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056486 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056514 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-router-certs\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.056597 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.057153 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-audit-policies\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.057213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 
11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.057442 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.057808 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-service-ca\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.063193 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-error\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.063256 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-session\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.063440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-router-certs\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.065052 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-login\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.065560 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.067996 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.079719 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " 
pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.080187 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fca4c497-c46f-4513-b978-d8abd595578d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.086865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx6j7\" (UniqueName: \"kubernetes.io/projected/fca4c497-c46f-4513-b978-d8abd595578d-kube-api-access-zx6j7\") pod \"oauth-openshift-599f75c894-7s8wd\" (UID: \"fca4c497-c46f-4513-b978-d8abd595578d\") " pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.224980 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:44 crc kubenswrapper[4837]: I0111 17:36:44.719952 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-599f75c894-7s8wd"] Jan 11 17:36:45 crc kubenswrapper[4837]: I0111 17:36:45.032518 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" event={"ID":"fca4c497-c46f-4513-b978-d8abd595578d","Type":"ContainerStarted","Data":"a5f06ed124b7d9102d18a4d1ba02af361e3474e3e6cea1c479a8a5df09c5fe10"} Jan 11 17:36:46 crc kubenswrapper[4837]: I0111 17:36:46.040173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" event={"ID":"fca4c497-c46f-4513-b978-d8abd595578d","Type":"ContainerStarted","Data":"ac4e4a978ab2fa5bbf09d3a2b85b8ab85bbc93f0e97cc870c72d56cfd893e306"} Jan 11 17:36:46 crc kubenswrapper[4837]: I0111 17:36:46.040804 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:46 crc kubenswrapper[4837]: I0111 17:36:46.045796 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" Jan 11 17:36:46 crc kubenswrapper[4837]: I0111 17:36:46.073946 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-599f75c894-7s8wd" podStartSLOduration=37.073925551 podStartE2EDuration="37.073925551s" podCreationTimestamp="2026-01-11 17:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:36:46.069389105 +0000 UTC m=+380.247581821" watchObservedRunningTime="2026-01-11 17:36:46.073925551 +0000 UTC m=+380.252118257" Jan 11 17:37:09 crc kubenswrapper[4837]: I0111 17:37:09.444230 4837 
patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:37:09 crc kubenswrapper[4837]: I0111 17:37:09.444913 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.259320 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29ts4"] Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.260361 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-29ts4" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="registry-server" containerID="cri-o://8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea" gracePeriod=30 Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.267165 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxndb"] Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.279698 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gkkhd"] Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.279948 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerName="marketplace-operator" containerID="cri-o://bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72" gracePeriod=30 Jan 11 17:37:23 crc 
kubenswrapper[4837]: I0111 17:37:23.285958 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxndb" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="registry-server" containerID="cri-o://fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90" gracePeriod=30 Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.297904 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9blrl"] Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.298163 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9blrl" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="registry-server" containerID="cri-o://4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8" gracePeriod=30 Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.304204 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xb262"] Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.304865 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.315466 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xb262"] Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.318609 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-slftv"] Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.318843 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-slftv" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="registry-server" containerID="cri-o://93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8" gracePeriod=30 Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.404190 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nld\" (UniqueName: \"kubernetes.io/projected/a46aac0a-4b71-4559-9481-499e240587e4-kube-api-access-54nld\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.404259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46aac0a-4b71-4559-9481-499e240587e4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.404354 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a46aac0a-4b71-4559-9481-499e240587e4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.505582 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46aac0a-4b71-4559-9481-499e240587e4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.505664 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54nld\" (UniqueName: \"kubernetes.io/projected/a46aac0a-4b71-4559-9481-499e240587e4-kube-api-access-54nld\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.505705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46aac0a-4b71-4559-9481-499e240587e4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.507248 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46aac0a-4b71-4559-9481-499e240587e4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc 
kubenswrapper[4837]: I0111 17:37:23.511251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46aac0a-4b71-4559-9481-499e240587e4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.523043 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nld\" (UniqueName: \"kubernetes.io/projected/a46aac0a-4b71-4559-9481-499e240587e4-kube-api-access-54nld\") pod \"marketplace-operator-79b997595-xb262\" (UID: \"a46aac0a-4b71-4559-9481-499e240587e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.733905 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.752784 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.886153 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.892756 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.897445 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.901727 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918308 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-utilities\") pod \"59797bd6-cb69-412d-952b-1673312648e2\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-utilities\") pod \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918372 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-catalog-content\") pod \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918400 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-utilities\") pod \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-catalog-content\") pod \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-operator-metrics\") pod \"f9e34a1e-5456-4b26-b347-aa569c5987d5\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918455 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtjfz\" (UniqueName: \"kubernetes.io/projected/f9e34a1e-5456-4b26-b347-aa569c5987d5-kube-api-access-gtjfz\") pod \"f9e34a1e-5456-4b26-b347-aa569c5987d5\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qkcm\" (UniqueName: \"kubernetes.io/projected/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-kube-api-access-7qkcm\") pod \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\" (UID: \"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69k8h\" (UniqueName: \"kubernetes.io/projected/30409294-8779-48ad-a6e8-36b662f09c0f-kube-api-access-69k8h\") pod \"30409294-8779-48ad-a6e8-36b662f09c0f\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918507 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-catalog-content\") pod \"59797bd6-cb69-412d-952b-1673312648e2\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918521 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-catalog-content\") pod \"30409294-8779-48ad-a6e8-36b662f09c0f\" 
(UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918546 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-utilities\") pod \"30409294-8779-48ad-a6e8-36b662f09c0f\" (UID: \"30409294-8779-48ad-a6e8-36b662f09c0f\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918565 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md5ls\" (UniqueName: \"kubernetes.io/projected/59797bd6-cb69-412d-952b-1673312648e2-kube-api-access-md5ls\") pod \"59797bd6-cb69-412d-952b-1673312648e2\" (UID: \"59797bd6-cb69-412d-952b-1673312648e2\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918589 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-trusted-ca\") pod \"f9e34a1e-5456-4b26-b347-aa569c5987d5\" (UID: \"f9e34a1e-5456-4b26-b347-aa569c5987d5\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.918613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8dhm\" (UniqueName: \"kubernetes.io/projected/1d39ff8b-c79a-46ea-af70-0902ce0ee504-kube-api-access-k8dhm\") pod \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\" (UID: \"1d39ff8b-c79a-46ea-af70-0902ce0ee504\") " Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.924515 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-utilities" (OuterVolumeSpecName: "utilities") pod "1d39ff8b-c79a-46ea-af70-0902ce0ee504" (UID: "1d39ff8b-c79a-46ea-af70-0902ce0ee504"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.924657 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-utilities" (OuterVolumeSpecName: "utilities") pod "30409294-8779-48ad-a6e8-36b662f09c0f" (UID: "30409294-8779-48ad-a6e8-36b662f09c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.924886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e34a1e-5456-4b26-b347-aa569c5987d5-kube-api-access-gtjfz" (OuterVolumeSpecName: "kube-api-access-gtjfz") pod "f9e34a1e-5456-4b26-b347-aa569c5987d5" (UID: "f9e34a1e-5456-4b26-b347-aa569c5987d5"). InnerVolumeSpecName "kube-api-access-gtjfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.925457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f9e34a1e-5456-4b26-b347-aa569c5987d5" (UID: "f9e34a1e-5456-4b26-b347-aa569c5987d5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.925567 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d39ff8b-c79a-46ea-af70-0902ce0ee504-kube-api-access-k8dhm" (OuterVolumeSpecName: "kube-api-access-k8dhm") pod "1d39ff8b-c79a-46ea-af70-0902ce0ee504" (UID: "1d39ff8b-c79a-46ea-af70-0902ce0ee504"). InnerVolumeSpecName "kube-api-access-k8dhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.925862 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-utilities" (OuterVolumeSpecName: "utilities") pod "5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" (UID: "5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.926169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-utilities" (OuterVolumeSpecName: "utilities") pod "59797bd6-cb69-412d-952b-1673312648e2" (UID: "59797bd6-cb69-412d-952b-1673312648e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.926381 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59797bd6-cb69-412d-952b-1673312648e2-kube-api-access-md5ls" (OuterVolumeSpecName: "kube-api-access-md5ls") pod "59797bd6-cb69-412d-952b-1673312648e2" (UID: "59797bd6-cb69-412d-952b-1673312648e2"). InnerVolumeSpecName "kube-api-access-md5ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.927295 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f9e34a1e-5456-4b26-b347-aa569c5987d5" (UID: "f9e34a1e-5456-4b26-b347-aa569c5987d5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.927832 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30409294-8779-48ad-a6e8-36b662f09c0f-kube-api-access-69k8h" (OuterVolumeSpecName: "kube-api-access-69k8h") pod "30409294-8779-48ad-a6e8-36b662f09c0f" (UID: "30409294-8779-48ad-a6e8-36b662f09c0f"). InnerVolumeSpecName "kube-api-access-69k8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.928197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-kube-api-access-7qkcm" (OuterVolumeSpecName: "kube-api-access-7qkcm") pod "5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" (UID: "5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821"). InnerVolumeSpecName "kube-api-access-7qkcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.949551 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" (UID: "5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:23 crc kubenswrapper[4837]: I0111 17:37:23.979872 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30409294-8779-48ad-a6e8-36b662f09c0f" (UID: "30409294-8779-48ad-a6e8-36b662f09c0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.000577 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d39ff8b-c79a-46ea-af70-0902ce0ee504" (UID: "1d39ff8b-c79a-46ea-af70-0902ce0ee504"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019143 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8dhm\" (UniqueName: \"kubernetes.io/projected/1d39ff8b-c79a-46ea-af70-0902ce0ee504-kube-api-access-k8dhm\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019428 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019439 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019450 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019462 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d39ff8b-c79a-46ea-af70-0902ce0ee504-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019475 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019486 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019497 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtjfz\" (UniqueName: \"kubernetes.io/projected/f9e34a1e-5456-4b26-b347-aa569c5987d5-kube-api-access-gtjfz\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019510 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qkcm\" (UniqueName: \"kubernetes.io/projected/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821-kube-api-access-7qkcm\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019520 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69k8h\" (UniqueName: \"kubernetes.io/projected/30409294-8779-48ad-a6e8-36b662f09c0f-kube-api-access-69k8h\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019528 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019536 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30409294-8779-48ad-a6e8-36b662f09c0f-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019544 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md5ls\" (UniqueName: 
\"kubernetes.io/projected/59797bd6-cb69-412d-952b-1673312648e2-kube-api-access-md5ls\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.019552 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e34a1e-5456-4b26-b347-aa569c5987d5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.067327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59797bd6-cb69-412d-952b-1673312648e2" (UID: "59797bd6-cb69-412d-952b-1673312648e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.120034 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59797bd6-cb69-412d-952b-1673312648e2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.269766 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xb262"] Jan 11 17:37:24 crc kubenswrapper[4837]: W0111 17:37:24.281920 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda46aac0a_4b71_4559_9481_499e240587e4.slice/crio-3af1f580b7aa72e5b0314bec901bc59e80e128ea83bcb7231041732481182b24 WatchSource:0}: Error finding container 3af1f580b7aa72e5b0314bec901bc59e80e128ea83bcb7231041732481182b24: Status 404 returned error can't find the container with id 3af1f580b7aa72e5b0314bec901bc59e80e128ea83bcb7231041732481182b24 Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.315241 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerID="fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90" exitCode=0 Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.315280 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxndb" event={"ID":"1d39ff8b-c79a-46ea-af70-0902ce0ee504","Type":"ContainerDied","Data":"fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.315400 4837 scope.go:117] "RemoveContainer" containerID="fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.315400 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxndb" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.315792 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxndb" event={"ID":"1d39ff8b-c79a-46ea-af70-0902ce0ee504","Type":"ContainerDied","Data":"91a0daa167fdd699229a9451fc55940853efe015d3de3667f9508a5aaddca406"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.317010 4837 generic.go:334] "Generic (PLEG): container finished" podID="30409294-8779-48ad-a6e8-36b662f09c0f" containerID="8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea" exitCode=0 Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.317058 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29ts4" event={"ID":"30409294-8779-48ad-a6e8-36b662f09c0f","Type":"ContainerDied","Data":"8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.317075 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29ts4" 
event={"ID":"30409294-8779-48ad-a6e8-36b662f09c0f","Type":"ContainerDied","Data":"5b85927c5fcc0dec12428ea7bf9832e3dd16d17005fa34a24b37d1ca1c528396"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.317179 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29ts4" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.321025 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xb262" event={"ID":"a46aac0a-4b71-4559-9481-499e240587e4","Type":"ContainerStarted","Data":"3af1f580b7aa72e5b0314bec901bc59e80e128ea83bcb7231041732481182b24"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.324427 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerID="bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72" exitCode=0 Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.324792 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" event={"ID":"f9e34a1e-5456-4b26-b347-aa569c5987d5","Type":"ContainerDied","Data":"bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.325392 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" event={"ID":"f9e34a1e-5456-4b26-b347-aa569c5987d5","Type":"ContainerDied","Data":"06566f0c1c3ccdfdd35cacfa70f38cd1bcb1f7c65955d9575205fbbfabc11c09"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.324889 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gkkhd" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.328147 4837 generic.go:334] "Generic (PLEG): container finished" podID="59797bd6-cb69-412d-952b-1673312648e2" containerID="93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8" exitCode=0 Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.328184 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerDied","Data":"93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.328236 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-slftv" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.328217 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-slftv" event={"ID":"59797bd6-cb69-412d-952b-1673312648e2","Type":"ContainerDied","Data":"436761847f8f2a984da0011d27620401bb5241e31549e4dcb6885900fecaf7ec"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.331599 4837 generic.go:334] "Generic (PLEG): container finished" podID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerID="4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8" exitCode=0 Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.331625 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9blrl" event={"ID":"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821","Type":"ContainerDied","Data":"4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.331644 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9blrl" 
event={"ID":"5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821","Type":"ContainerDied","Data":"4e78beaaa18406072bf90fd6ff26fa4da2689d59e9678b66ef1563919ebf2ad2"} Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.331740 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9blrl" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.352892 4837 scope.go:117] "RemoveContainer" containerID="f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.358262 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29ts4"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.362274 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-29ts4"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.370241 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" path="/var/lib/kubelet/pods/30409294-8779-48ad-a6e8-36b662f09c0f/volumes" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.401576 4837 scope.go:117] "RemoveContainer" containerID="d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.410505 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxndb"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.414608 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxndb"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.421263 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-slftv"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.421868 4837 scope.go:117] "RemoveContainer" containerID="fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90" Jan 
11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.423434 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90\": container with ID starting with fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90 not found: ID does not exist" containerID="fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.423470 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90"} err="failed to get container status \"fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90\": rpc error: code = NotFound desc = could not find container \"fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90\": container with ID starting with fd605aaf7ff12f50189ebe4c1f7e471af62160afef188bd4fb35bee5021e7b90 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.423496 4837 scope.go:117] "RemoveContainer" containerID="f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.425198 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751\": container with ID starting with f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751 not found: ID does not exist" containerID="f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.425232 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751"} err="failed to get container status 
\"f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751\": rpc error: code = NotFound desc = could not find container \"f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751\": container with ID starting with f91edf638c7e4927c1c4f40abc693622925ffbc5d1e9a0cc5110a01a70e32751 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.425277 4837 scope.go:117] "RemoveContainer" containerID="d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.425804 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6\": container with ID starting with d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6 not found: ID does not exist" containerID="d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.425834 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6"} err="failed to get container status \"d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6\": rpc error: code = NotFound desc = could not find container \"d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6\": container with ID starting with d21ebfa716d770b8c9e8a729545de23ae68451f6837985eda39090e41ba166d6 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.425852 4837 scope.go:117] "RemoveContainer" containerID="8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.425854 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-slftv"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.429544 4837 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gkkhd"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.437147 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gkkhd"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.455093 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9blrl"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.459287 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9blrl"] Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.473073 4837 scope.go:117] "RemoveContainer" containerID="68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.488062 4837 scope.go:117] "RemoveContainer" containerID="79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.505302 4837 scope.go:117] "RemoveContainer" containerID="8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.505898 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea\": container with ID starting with 8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea not found: ID does not exist" containerID="8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.505943 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea"} err="failed to get container status \"8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea\": rpc error: code = NotFound desc = 
could not find container \"8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea\": container with ID starting with 8fc494cd2c7422ba58b5dfac309e99f679c0b0579429b7f587db8be68c5c72ea not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.505976 4837 scope.go:117] "RemoveContainer" containerID="68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.506306 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d\": container with ID starting with 68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d not found: ID does not exist" containerID="68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.506351 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d"} err="failed to get container status \"68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d\": rpc error: code = NotFound desc = could not find container \"68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d\": container with ID starting with 68f6ca2c2560051f1c829a084474dabb893c400d035ba1b6d9936195de41fb0d not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.506377 4837 scope.go:117] "RemoveContainer" containerID="79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.506603 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9\": container with ID starting with 79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9 not 
found: ID does not exist" containerID="79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.506624 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9"} err="failed to get container status \"79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9\": rpc error: code = NotFound desc = could not find container \"79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9\": container with ID starting with 79a4e50dd9e139d48f1cb30436c0881bc8dd21770764ed8c1d18ce1389738be9 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.506637 4837 scope.go:117] "RemoveContainer" containerID="bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.519344 4837 scope.go:117] "RemoveContainer" containerID="22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.533134 4837 scope.go:117] "RemoveContainer" containerID="bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.535336 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72\": container with ID starting with bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72 not found: ID does not exist" containerID="bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.535382 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72"} err="failed to get container status 
\"bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72\": rpc error: code = NotFound desc = could not find container \"bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72\": container with ID starting with bd0e8b7a016ff310114e5185abb0d0ca9be702174d96027d70c5053ce5f91a72 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.535414 4837 scope.go:117] "RemoveContainer" containerID="22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.535766 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3\": container with ID starting with 22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3 not found: ID does not exist" containerID="22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.535799 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3"} err="failed to get container status \"22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3\": rpc error: code = NotFound desc = could not find container \"22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3\": container with ID starting with 22fe794e84e21e3d7722ec9d66f7b89843f7045cce8aa3bf0dc58a0edc5fa3b3 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.535821 4837 scope.go:117] "RemoveContainer" containerID="93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.550738 4837 scope.go:117] "RemoveContainer" containerID="ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.564510 4837 
scope.go:117] "RemoveContainer" containerID="3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.577402 4837 scope.go:117] "RemoveContainer" containerID="93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.577789 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8\": container with ID starting with 93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8 not found: ID does not exist" containerID="93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.577817 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8"} err="failed to get container status \"93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8\": rpc error: code = NotFound desc = could not find container \"93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8\": container with ID starting with 93720f4f395c01bc8939c97f1dcc75df87220bf32e6383b482042d185c96d3a8 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.577838 4837 scope.go:117] "RemoveContainer" containerID="ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.578230 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76\": container with ID starting with ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76 not found: ID does not exist" containerID="ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76" Jan 11 
17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.578276 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76"} err="failed to get container status \"ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76\": rpc error: code = NotFound desc = could not find container \"ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76\": container with ID starting with ff0befc9320321b51b2bd3d93706ae69b790b5ca019b8bc59af4eecf6b57da76 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.578306 4837 scope.go:117] "RemoveContainer" containerID="3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.578681 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309\": container with ID starting with 3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309 not found: ID does not exist" containerID="3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.578705 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309"} err="failed to get container status \"3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309\": rpc error: code = NotFound desc = could not find container \"3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309\": container with ID starting with 3683e44906242287c17aef068398977dc417d8d42f4462c46ba87cf6ec098309 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.578719 4837 scope.go:117] "RemoveContainer" 
containerID="4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.591275 4837 scope.go:117] "RemoveContainer" containerID="421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.605880 4837 scope.go:117] "RemoveContainer" containerID="4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.617874 4837 scope.go:117] "RemoveContainer" containerID="4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.618224 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8\": container with ID starting with 4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8 not found: ID does not exist" containerID="4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.618251 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8"} err="failed to get container status \"4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8\": rpc error: code = NotFound desc = could not find container \"4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8\": container with ID starting with 4fba69108dfb028b8c2325a937358a28a67b68d650ad355ffcc40011912243f8 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.618270 4837 scope.go:117] "RemoveContainer" containerID="421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.618873 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753\": container with ID starting with 421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753 not found: ID does not exist" containerID="421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.618894 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753"} err="failed to get container status \"421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753\": rpc error: code = NotFound desc = could not find container \"421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753\": container with ID starting with 421eb4044940a46e9146cd116d17d0e4f6ac4e87355b9596d42b49d3eef92753 not found: ID does not exist" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.618907 4837 scope.go:117] "RemoveContainer" containerID="4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080" Jan 11 17:37:24 crc kubenswrapper[4837]: E0111 17:37:24.619107 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080\": container with ID starting with 4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080 not found: ID does not exist" containerID="4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080" Jan 11 17:37:24 crc kubenswrapper[4837]: I0111 17:37:24.619125 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080"} err="failed to get container status \"4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080\": rpc error: code = NotFound desc = could not find container 
\"4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080\": container with ID starting with 4bf1ec8c973fcebc64c9f7fd102a91408eee4713be76ef7f51560569c85fc080 not found: ID does not exist" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.346828 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xb262" event={"ID":"a46aac0a-4b71-4559-9481-499e240587e4","Type":"ContainerStarted","Data":"01cfe4f3d1e52e147ef26a2728c1a85a96bd7197e7ebc3a8d252a7e9b45269c2"} Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.350282 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.352462 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xb262" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.368245 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xb262" podStartSLOduration=2.368221478 podStartE2EDuration="2.368221478s" podCreationTimestamp="2026-01-11 17:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:37:25.364501232 +0000 UTC m=+419.542693938" watchObservedRunningTime="2026-01-11 17:37:25.368221478 +0000 UTC m=+419.546414204" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.484692 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cg2qj"] Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485084 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485125 4837 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485155 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485171 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485194 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485209 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485226 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerName="marketplace-operator" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485239 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerName="marketplace-operator" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485255 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485266 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485307 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485323 4837 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485361 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485377 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485399 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485417 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485439 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerName="marketplace-operator" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485455 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerName="marketplace-operator" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485479 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485495 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485521 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485540 4837 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="extract-content" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485561 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485576 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485595 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485610 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="extract-utilities" Jan 11 17:37:25 crc kubenswrapper[4837]: E0111 17:37:25.485636 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485652 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485859 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerName="marketplace-operator" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485881 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" containerName="marketplace-operator" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485897 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485915 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485934 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="30409294-8779-48ad-a6e8-36b662f09c0f" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.485955 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="59797bd6-cb69-412d-952b-1673312648e2" containerName="registry-server" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.487220 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.489824 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg2qj"] Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.493200 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.640135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/179b571c-cc46-454a-bf70-652f09e1c934-catalog-content\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.640204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8t7w\" (UniqueName: \"kubernetes.io/projected/179b571c-cc46-454a-bf70-652f09e1c934-kube-api-access-t8t7w\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.640240 
4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/179b571c-cc46-454a-bf70-652f09e1c934-utilities\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.680226 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bh2tn"] Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.681765 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.689651 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bh2tn"] Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.689945 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.741078 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8t7w\" (UniqueName: \"kubernetes.io/projected/179b571c-cc46-454a-bf70-652f09e1c934-kube-api-access-t8t7w\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.741129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/179b571c-cc46-454a-bf70-652f09e1c934-utilities\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.741314 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/179b571c-cc46-454a-bf70-652f09e1c934-catalog-content\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.741577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/179b571c-cc46-454a-bf70-652f09e1c934-utilities\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.742918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/179b571c-cc46-454a-bf70-652f09e1c934-catalog-content\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.760510 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8t7w\" (UniqueName: \"kubernetes.io/projected/179b571c-cc46-454a-bf70-652f09e1c934-kube-api-access-t8t7w\") pod \"redhat-marketplace-cg2qj\" (UID: \"179b571c-cc46-454a-bf70-652f09e1c934\") " pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.812910 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.842615 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efe9489-3447-4e79-b762-935c88a0c3fe-catalog-content\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.842817 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efe9489-3447-4e79-b762-935c88a0c3fe-utilities\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.842954 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btrn6\" (UniqueName: \"kubernetes.io/projected/2efe9489-3447-4e79-b762-935c88a0c3fe-kube-api-access-btrn6\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.944134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efe9489-3447-4e79-b762-935c88a0c3fe-utilities\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.944666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btrn6\" (UniqueName: \"kubernetes.io/projected/2efe9489-3447-4e79-b762-935c88a0c3fe-kube-api-access-btrn6\") pod 
\"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.944745 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efe9489-3447-4e79-b762-935c88a0c3fe-catalog-content\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.945460 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2efe9489-3447-4e79-b762-935c88a0c3fe-utilities\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.945728 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2efe9489-3447-4e79-b762-935c88a0c3fe-catalog-content\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:25 crc kubenswrapper[4837]: I0111 17:37:25.964392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btrn6\" (UniqueName: \"kubernetes.io/projected/2efe9489-3447-4e79-b762-935c88a0c3fe-kube-api-access-btrn6\") pod \"certified-operators-bh2tn\" (UID: \"2efe9489-3447-4e79-b762-935c88a0c3fe\") " pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.009464 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.206448 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg2qj"] Jan 11 17:37:26 crc kubenswrapper[4837]: W0111 17:37:26.209513 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod179b571c_cc46_454a_bf70_652f09e1c934.slice/crio-5e11d0f00157bf3470fe2d28b3465665a8d12dee7f686577276eb5159a056996 WatchSource:0}: Error finding container 5e11d0f00157bf3470fe2d28b3465665a8d12dee7f686577276eb5159a056996: Status 404 returned error can't find the container with id 5e11d0f00157bf3470fe2d28b3465665a8d12dee7f686577276eb5159a056996 Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.359374 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg2qj" event={"ID":"179b571c-cc46-454a-bf70-652f09e1c934","Type":"ContainerStarted","Data":"5e11d0f00157bf3470fe2d28b3465665a8d12dee7f686577276eb5159a056996"} Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.369309 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d39ff8b-c79a-46ea-af70-0902ce0ee504" path="/var/lib/kubelet/pods/1d39ff8b-c79a-46ea-af70-0902ce0ee504/volumes" Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.370259 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59797bd6-cb69-412d-952b-1673312648e2" path="/var/lib/kubelet/pods/59797bd6-cb69-412d-952b-1673312648e2/volumes" Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.371237 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821" path="/var/lib/kubelet/pods/5a6d8e6a-5e4a-4ff4-92ec-b2bfee9ef821/volumes" Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.372960 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f9e34a1e-5456-4b26-b347-aa569c5987d5" path="/var/lib/kubelet/pods/f9e34a1e-5456-4b26-b347-aa569c5987d5/volumes" Jan 11 17:37:26 crc kubenswrapper[4837]: I0111 17:37:26.392444 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bh2tn"] Jan 11 17:37:26 crc kubenswrapper[4837]: W0111 17:37:26.399803 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2efe9489_3447_4e79_b762_935c88a0c3fe.slice/crio-166a704d455b80efd9c21275b299776b3c96163cd0a28b0143e46e93acf6d65a WatchSource:0}: Error finding container 166a704d455b80efd9c21275b299776b3c96163cd0a28b0143e46e93acf6d65a: Status 404 returned error can't find the container with id 166a704d455b80efd9c21275b299776b3c96163cd0a28b0143e46e93acf6d65a Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.365375 4837 generic.go:334] "Generic (PLEG): container finished" podID="179b571c-cc46-454a-bf70-652f09e1c934" containerID="f4699969ba05ea43e92b4ca6c24f6d3aed77d7e0d563d3e9582336d6f24e1697" exitCode=0 Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.365434 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg2qj" event={"ID":"179b571c-cc46-454a-bf70-652f09e1c934","Type":"ContainerDied","Data":"f4699969ba05ea43e92b4ca6c24f6d3aed77d7e0d563d3e9582336d6f24e1697"} Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.367809 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.369457 4837 generic.go:334] "Generic (PLEG): container finished" podID="2efe9489-3447-4e79-b762-935c88a0c3fe" containerID="e36f706fb4c42f2013f4454ff8fb9b117b97b0576779f53a9dfa7b762215adfd" exitCode=0 Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.369573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh2tn" 
event={"ID":"2efe9489-3447-4e79-b762-935c88a0c3fe","Type":"ContainerDied","Data":"e36f706fb4c42f2013f4454ff8fb9b117b97b0576779f53a9dfa7b762215adfd"} Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.369635 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh2tn" event={"ID":"2efe9489-3447-4e79-b762-935c88a0c3fe","Type":"ContainerStarted","Data":"166a704d455b80efd9c21275b299776b3c96163cd0a28b0143e46e93acf6d65a"} Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.877777 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lsj24"] Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.879140 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.880783 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.899016 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsj24"] Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.970848 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/734801c7-6fa0-4055-a0bd-22b2824d4312-catalog-content\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.970978 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlclm\" (UniqueName: \"kubernetes.io/projected/734801c7-6fa0-4055-a0bd-22b2824d4312-kube-api-access-dlclm\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " 
pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:27 crc kubenswrapper[4837]: I0111 17:37:27.971070 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734801c7-6fa0-4055-a0bd-22b2824d4312-utilities\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.072484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlclm\" (UniqueName: \"kubernetes.io/projected/734801c7-6fa0-4055-a0bd-22b2824d4312-kube-api-access-dlclm\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.072570 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734801c7-6fa0-4055-a0bd-22b2824d4312-utilities\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.072607 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/734801c7-6fa0-4055-a0bd-22b2824d4312-catalog-content\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.072901 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gs7kc"] Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.073099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/734801c7-6fa0-4055-a0bd-22b2824d4312-catalog-content\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.073306 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734801c7-6fa0-4055-a0bd-22b2824d4312-utilities\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.073846 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.075821 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.086330 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gs7kc"] Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.098379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlclm\" (UniqueName: \"kubernetes.io/projected/734801c7-6fa0-4055-a0bd-22b2824d4312-kube-api-access-dlclm\") pod \"community-operators-lsj24\" (UID: \"734801c7-6fa0-4055-a0bd-22b2824d4312\") " pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.208248 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.275639 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvns6\" (UniqueName: \"kubernetes.io/projected/2467754a-ee89-4272-886e-bd185cc623a3-kube-api-access-qvns6\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.275983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2467754a-ee89-4272-886e-bd185cc623a3-utilities\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.276211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2467754a-ee89-4272-886e-bd185cc623a3-catalog-content\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.377361 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2467754a-ee89-4272-886e-bd185cc623a3-utilities\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.377855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2467754a-ee89-4272-886e-bd185cc623a3-catalog-content\") pod \"redhat-operators-gs7kc\" (UID: 
\"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.377980 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvns6\" (UniqueName: \"kubernetes.io/projected/2467754a-ee89-4272-886e-bd185cc623a3-kube-api-access-qvns6\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.378377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2467754a-ee89-4272-886e-bd185cc623a3-utilities\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.378435 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2467754a-ee89-4272-886e-bd185cc623a3-catalog-content\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.402048 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvns6\" (UniqueName: \"kubernetes.io/projected/2467754a-ee89-4272-886e-bd185cc623a3-kube-api-access-qvns6\") pod \"redhat-operators-gs7kc\" (UID: \"2467754a-ee89-4272-886e-bd185cc623a3\") " pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.509968 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsj24"] Jan 11 17:37:28 crc kubenswrapper[4837]: I0111 17:37:28.699591 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.080982 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gs7kc"] Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.385805 4837 generic.go:334] "Generic (PLEG): container finished" podID="179b571c-cc46-454a-bf70-652f09e1c934" containerID="4573939045fa39236fba982707885ce3d668038c9a40341ecb4e30fa474ba25d" exitCode=0 Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.385930 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg2qj" event={"ID":"179b571c-cc46-454a-bf70-652f09e1c934","Type":"ContainerDied","Data":"4573939045fa39236fba982707885ce3d668038c9a40341ecb4e30fa474ba25d"} Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.393273 4837 generic.go:334] "Generic (PLEG): container finished" podID="2467754a-ee89-4272-886e-bd185cc623a3" containerID="f849ded0ee439245bef432bcad3d60f4574113b9b5701d7b1a163eeefb4224ca" exitCode=0 Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.393412 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs7kc" event={"ID":"2467754a-ee89-4272-886e-bd185cc623a3","Type":"ContainerDied","Data":"f849ded0ee439245bef432bcad3d60f4574113b9b5701d7b1a163eeefb4224ca"} Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.393498 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs7kc" event={"ID":"2467754a-ee89-4272-886e-bd185cc623a3","Type":"ContainerStarted","Data":"8ef81221297cf606f0f82583a59d87690577de2ed746e25bab3bd071bb6262e0"} Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.396554 4837 generic.go:334] "Generic (PLEG): container finished" podID="2efe9489-3447-4e79-b762-935c88a0c3fe" containerID="4f0246c5ef954c0ec90347753f8ad4be42853717b71fc782a0f47898a75fda95" exitCode=0 Jan 11 17:37:29 crc 
kubenswrapper[4837]: I0111 17:37:29.396595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bh2tn" event={"ID":"2efe9489-3447-4e79-b762-935c88a0c3fe","Type":"ContainerDied","Data":"4f0246c5ef954c0ec90347753f8ad4be42853717b71fc782a0f47898a75fda95"} Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.408052 4837 generic.go:334] "Generic (PLEG): container finished" podID="734801c7-6fa0-4055-a0bd-22b2824d4312" containerID="81d99cdbe64dbe37616c679e7dedb9fd4ad5262e8555f9bb8ea9bd9f75ccd951" exitCode=0 Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.408092 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsj24" event={"ID":"734801c7-6fa0-4055-a0bd-22b2824d4312","Type":"ContainerDied","Data":"81d99cdbe64dbe37616c679e7dedb9fd4ad5262e8555f9bb8ea9bd9f75ccd951"} Jan 11 17:37:29 crc kubenswrapper[4837]: I0111 17:37:29.408118 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsj24" event={"ID":"734801c7-6fa0-4055-a0bd-22b2824d4312","Type":"ContainerStarted","Data":"d21d8e9629862fc185facf779e11fa45d955a345810e7babd230ac88bae18636"} Jan 11 17:37:30 crc kubenswrapper[4837]: I0111 17:37:30.414471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg2qj" event={"ID":"179b571c-cc46-454a-bf70-652f09e1c934","Type":"ContainerStarted","Data":"e57b87d4e0d2c9a7d2da9b7360c431804b716d920945df1eb524defa9e1e213f"} Jan 11 17:37:30 crc kubenswrapper[4837]: I0111 17:37:30.416705 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs7kc" event={"ID":"2467754a-ee89-4272-886e-bd185cc623a3","Type":"ContainerStarted","Data":"7b107d0bb71aecd2d3deec5db61e0600a44e092ffcf7162cf0c3b528e8a78c1e"} Jan 11 17:37:30 crc kubenswrapper[4837]: I0111 17:37:30.419723 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bh2tn" event={"ID":"2efe9489-3447-4e79-b762-935c88a0c3fe","Type":"ContainerStarted","Data":"b356fb00174f85afab7498c13129ea02dea390be3a4bdf7609b065c119198eda"} Jan 11 17:37:30 crc kubenswrapper[4837]: I0111 17:37:30.438250 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cg2qj" podStartSLOduration=2.988777492 podStartE2EDuration="5.438234468s" podCreationTimestamp="2026-01-11 17:37:25 +0000 UTC" firstStartedPulling="2026-01-11 17:37:27.367548797 +0000 UTC m=+421.545741503" lastFinishedPulling="2026-01-11 17:37:29.817005773 +0000 UTC m=+423.995198479" observedRunningTime="2026-01-11 17:37:30.434566224 +0000 UTC m=+424.612758940" watchObservedRunningTime="2026-01-11 17:37:30.438234468 +0000 UTC m=+424.616427184" Jan 11 17:37:30 crc kubenswrapper[4837]: I0111 17:37:30.455720 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bh2tn" podStartSLOduration=2.983129532 podStartE2EDuration="5.455698653s" podCreationTimestamp="2026-01-11 17:37:25 +0000 UTC" firstStartedPulling="2026-01-11 17:37:27.371815098 +0000 UTC m=+421.550007804" lastFinishedPulling="2026-01-11 17:37:29.844384219 +0000 UTC m=+424.022576925" observedRunningTime="2026-01-11 17:37:30.449061385 +0000 UTC m=+424.627254091" watchObservedRunningTime="2026-01-11 17:37:30.455698653 +0000 UTC m=+424.633891379" Jan 11 17:37:31 crc kubenswrapper[4837]: I0111 17:37:31.426752 4837 generic.go:334] "Generic (PLEG): container finished" podID="734801c7-6fa0-4055-a0bd-22b2824d4312" containerID="496f041d7ba3bb60ac2f4adf5c058d4c0ce52497744ee8c707e35bb9a849576c" exitCode=0 Jan 11 17:37:31 crc kubenswrapper[4837]: I0111 17:37:31.426839 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsj24" 
event={"ID":"734801c7-6fa0-4055-a0bd-22b2824d4312","Type":"ContainerDied","Data":"496f041d7ba3bb60ac2f4adf5c058d4c0ce52497744ee8c707e35bb9a849576c"} Jan 11 17:37:31 crc kubenswrapper[4837]: I0111 17:37:31.429473 4837 generic.go:334] "Generic (PLEG): container finished" podID="2467754a-ee89-4272-886e-bd185cc623a3" containerID="7b107d0bb71aecd2d3deec5db61e0600a44e092ffcf7162cf0c3b528e8a78c1e" exitCode=0 Jan 11 17:37:31 crc kubenswrapper[4837]: I0111 17:37:31.429649 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs7kc" event={"ID":"2467754a-ee89-4272-886e-bd185cc623a3","Type":"ContainerDied","Data":"7b107d0bb71aecd2d3deec5db61e0600a44e092ffcf7162cf0c3b528e8a78c1e"} Jan 11 17:37:32 crc kubenswrapper[4837]: I0111 17:37:32.436130 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsj24" event={"ID":"734801c7-6fa0-4055-a0bd-22b2824d4312","Type":"ContainerStarted","Data":"56c93077794083bd8d2cc76848f93b72f14326bd7f7d49edb893773340fe10ff"} Jan 11 17:37:32 crc kubenswrapper[4837]: I0111 17:37:32.438453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs7kc" event={"ID":"2467754a-ee89-4272-886e-bd185cc623a3","Type":"ContainerStarted","Data":"848a285ebee50f3e3c7a15c43f8f85ffb897096cfe981018368e9122ba9a8e8d"} Jan 11 17:37:32 crc kubenswrapper[4837]: I0111 17:37:32.453759 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lsj24" podStartSLOduration=3.059884085 podStartE2EDuration="5.453747776s" podCreationTimestamp="2026-01-11 17:37:27 +0000 UTC" firstStartedPulling="2026-01-11 17:37:29.411580383 +0000 UTC m=+423.589773089" lastFinishedPulling="2026-01-11 17:37:31.805444074 +0000 UTC m=+425.983636780" observedRunningTime="2026-01-11 17:37:32.452968354 +0000 UTC m=+426.631161090" watchObservedRunningTime="2026-01-11 17:37:32.453747776 +0000 UTC m=+426.631940482" 
Jan 11 17:37:32 crc kubenswrapper[4837]: I0111 17:37:32.473025 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gs7kc" podStartSLOduration=2.009113237 podStartE2EDuration="4.473003662s" podCreationTimestamp="2026-01-11 17:37:28 +0000 UTC" firstStartedPulling="2026-01-11 17:37:29.394731086 +0000 UTC m=+423.572923792" lastFinishedPulling="2026-01-11 17:37:31.858621511 +0000 UTC m=+426.036814217" observedRunningTime="2026-01-11 17:37:32.469552045 +0000 UTC m=+426.647744781" watchObservedRunningTime="2026-01-11 17:37:32.473003662 +0000 UTC m=+426.651196388" Jan 11 17:37:35 crc kubenswrapper[4837]: I0111 17:37:35.813603 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:35 crc kubenswrapper[4837]: I0111 17:37:35.814196 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:35 crc kubenswrapper[4837]: I0111 17:37:35.868591 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:36 crc kubenswrapper[4837]: I0111 17:37:36.009879 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:36 crc kubenswrapper[4837]: I0111 17:37:36.009928 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:36 crc kubenswrapper[4837]: I0111 17:37:36.043237 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:36 crc kubenswrapper[4837]: I0111 17:37:36.501538 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bh2tn" Jan 11 17:37:36 crc kubenswrapper[4837]: 
I0111 17:37:36.503356 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cg2qj" Jan 11 17:37:38 crc kubenswrapper[4837]: I0111 17:37:38.208377 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:38 crc kubenswrapper[4837]: I0111 17:37:38.209803 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:38 crc kubenswrapper[4837]: I0111 17:37:38.251771 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:38 crc kubenswrapper[4837]: I0111 17:37:38.504191 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lsj24" Jan 11 17:37:38 crc kubenswrapper[4837]: I0111 17:37:38.699863 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:38 crc kubenswrapper[4837]: I0111 17:37:38.700965 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:38 crc kubenswrapper[4837]: I0111 17:37:38.735916 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:37:39 crc kubenswrapper[4837]: I0111 17:37:39.444376 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:37:39 crc kubenswrapper[4837]: I0111 17:37:39.444432 4837 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:37:39 crc kubenswrapper[4837]: I0111 17:37:39.518335 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gs7kc" Jan 11 17:38:09 crc kubenswrapper[4837]: I0111 17:38:09.444586 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:38:09 crc kubenswrapper[4837]: I0111 17:38:09.445312 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:38:09 crc kubenswrapper[4837]: I0111 17:38:09.445384 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:38:09 crc kubenswrapper[4837]: I0111 17:38:09.446223 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51a674f214ce0ccc55b2fa9005d4dce39df2f19cf7b9f9089590388fd9cdba68"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 17:38:09 crc kubenswrapper[4837]: I0111 17:38:09.446322 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://51a674f214ce0ccc55b2fa9005d4dce39df2f19cf7b9f9089590388fd9cdba68" gracePeriod=600 Jan 11 17:38:10 crc kubenswrapper[4837]: I0111 17:38:10.654411 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="51a674f214ce0ccc55b2fa9005d4dce39df2f19cf7b9f9089590388fd9cdba68" exitCode=0 Jan 11 17:38:10 crc kubenswrapper[4837]: I0111 17:38:10.654478 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"51a674f214ce0ccc55b2fa9005d4dce39df2f19cf7b9f9089590388fd9cdba68"} Jan 11 17:38:10 crc kubenswrapper[4837]: I0111 17:38:10.654770 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"10f6c1fba8d2ded4e3d8d28a0fb8b27acf8c6a02810295dad13d2e54f622ba5d"} Jan 11 17:38:10 crc kubenswrapper[4837]: I0111 17:38:10.654793 4837 scope.go:117] "RemoveContainer" containerID="ef1d1b5ff926a1f2f0f357177d19214a1c92fbf76f445fb1a767d0d17cd1b4cb" Jan 11 17:40:39 crc kubenswrapper[4837]: I0111 17:40:39.444166 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:40:39 crc kubenswrapper[4837]: I0111 17:40:39.444805 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.778331 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fpqw8"] Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.779742 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.812899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-bound-sa-token\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.812954 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/931aeea7-90ca-4c0c-947c-21de9982f6aa-registry-certificates\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.813000 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.813057 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/931aeea7-90ca-4c0c-947c-21de9982f6aa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.813098 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/931aeea7-90ca-4c0c-947c-21de9982f6aa-trusted-ca\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.813119 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-registry-tls\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.813141 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2hzz\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-kube-api-access-h2hzz\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.813162 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/931aeea7-90ca-4c0c-947c-21de9982f6aa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.821276 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fpqw8"] Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.853005 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.915047 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/931aeea7-90ca-4c0c-947c-21de9982f6aa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.915289 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/931aeea7-90ca-4c0c-947c-21de9982f6aa-trusted-ca\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.915346 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-registry-tls\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.915369 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2hzz\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-kube-api-access-h2hzz\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.915390 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/931aeea7-90ca-4c0c-947c-21de9982f6aa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.915593 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/931aeea7-90ca-4c0c-947c-21de9982f6aa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.915793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-bound-sa-token\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.916419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/931aeea7-90ca-4c0c-947c-21de9982f6aa-registry-certificates\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.916481 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/931aeea7-90ca-4c0c-947c-21de9982f6aa-trusted-ca\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.917560 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/931aeea7-90ca-4c0c-947c-21de9982f6aa-registry-certificates\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.920946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-registry-tls\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.921191 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/931aeea7-90ca-4c0c-947c-21de9982f6aa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.929752 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-bound-sa-token\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: 
\"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:42 crc kubenswrapper[4837]: I0111 17:40:42.933901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2hzz\" (UniqueName: \"kubernetes.io/projected/931aeea7-90ca-4c0c-947c-21de9982f6aa-kube-api-access-h2hzz\") pod \"image-registry-66df7c8f76-fpqw8\" (UID: \"931aeea7-90ca-4c0c-947c-21de9982f6aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:43 crc kubenswrapper[4837]: I0111 17:40:43.099287 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:43 crc kubenswrapper[4837]: I0111 17:40:43.328453 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fpqw8"] Jan 11 17:40:44 crc kubenswrapper[4837]: I0111 17:40:44.181261 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" event={"ID":"931aeea7-90ca-4c0c-947c-21de9982f6aa","Type":"ContainerStarted","Data":"af1cd9d889bf2e7116bdfd0c63121d2bce50fe646e41ed2870e575f5afd70a0a"} Jan 11 17:40:44 crc kubenswrapper[4837]: I0111 17:40:44.181578 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:40:44 crc kubenswrapper[4837]: I0111 17:40:44.181600 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" event={"ID":"931aeea7-90ca-4c0c-947c-21de9982f6aa","Type":"ContainerStarted","Data":"e33baa4004e4c54105aa5e555086588a09f587440fec48dd8fcd15b8772e12e4"} Jan 11 17:40:44 crc kubenswrapper[4837]: I0111 17:40:44.207828 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" podStartSLOduration=2.207799172 
podStartE2EDuration="2.207799172s" podCreationTimestamp="2026-01-11 17:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:40:44.201304578 +0000 UTC m=+618.379497324" watchObservedRunningTime="2026-01-11 17:40:44.207799172 +0000 UTC m=+618.385991928" Jan 11 17:41:03 crc kubenswrapper[4837]: I0111 17:41:03.106923 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fpqw8" Jan 11 17:41:03 crc kubenswrapper[4837]: I0111 17:41:03.187737 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4nvrr"] Jan 11 17:41:09 crc kubenswrapper[4837]: I0111 17:41:09.444078 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:41:09 crc kubenswrapper[4837]: I0111 17:41:09.444453 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.229130 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" podUID="bb63c9f5-457d-4c61-8cc6-56690e66a952" containerName="registry" containerID="cri-o://6bc2f1ac34164e766e353d67f51950114a79249b4c0b2598ba338ed5bc4949b1" gracePeriod=30 Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.478249 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="bb63c9f5-457d-4c61-8cc6-56690e66a952" containerID="6bc2f1ac34164e766e353d67f51950114a79249b4c0b2598ba338ed5bc4949b1" exitCode=0 Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.478310 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" event={"ID":"bb63c9f5-457d-4c61-8cc6-56690e66a952","Type":"ContainerDied","Data":"6bc2f1ac34164e766e353d67f51950114a79249b4c0b2598ba338ed5bc4949b1"} Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.654002 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791279 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r45g8\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-kube-api-access-r45g8\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791597 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-trusted-ca\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791634 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb63c9f5-457d-4c61-8cc6-56690e66a952-installation-pull-secrets\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791664 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-bound-sa-token\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-certificates\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791732 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-tls\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791866 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.791894 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb63c9f5-457d-4c61-8cc6-56690e66a952-ca-trust-extracted\") pod \"bb63c9f5-457d-4c61-8cc6-56690e66a952\" (UID: \"bb63c9f5-457d-4c61-8cc6-56690e66a952\") " Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.792838 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.793722 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.802255 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb63c9f5-457d-4c61-8cc6-56690e66a952-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.803173 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.803746 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-kube-api-access-r45g8" (OuterVolumeSpecName: "kube-api-access-r45g8") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). InnerVolumeSpecName "kube-api-access-r45g8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.804392 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.804588 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.809635 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb63c9f5-457d-4c61-8cc6-56690e66a952-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bb63c9f5-457d-4c61-8cc6-56690e66a952" (UID: "bb63c9f5-457d-4c61-8cc6-56690e66a952"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.893748 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb63c9f5-457d-4c61-8cc6-56690e66a952-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.893784 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r45g8\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-kube-api-access-r45g8\") on node \"crc\" DevicePath \"\"" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.893799 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.893808 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb63c9f5-457d-4c61-8cc6-56690e66a952-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.893816 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.893824 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 11 17:41:28 crc kubenswrapper[4837]: I0111 17:41:28.893834 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb63c9f5-457d-4c61-8cc6-56690e66a952-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 11 17:41:29 crc 
kubenswrapper[4837]: I0111 17:41:29.489785 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" event={"ID":"bb63c9f5-457d-4c61-8cc6-56690e66a952","Type":"ContainerDied","Data":"e5206d4c4817d15db4f3a3b1c39591e603914dd0b75a0186d4159d621201c5aa"} Jan 11 17:41:29 crc kubenswrapper[4837]: I0111 17:41:29.490051 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4nvrr" Jan 11 17:41:29 crc kubenswrapper[4837]: I0111 17:41:29.490058 4837 scope.go:117] "RemoveContainer" containerID="6bc2f1ac34164e766e353d67f51950114a79249b4c0b2598ba338ed5bc4949b1" Jan 11 17:41:29 crc kubenswrapper[4837]: I0111 17:41:29.576861 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4nvrr"] Jan 11 17:41:29 crc kubenswrapper[4837]: I0111 17:41:29.581930 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4nvrr"] Jan 11 17:41:30 crc kubenswrapper[4837]: I0111 17:41:30.378383 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb63c9f5-457d-4c61-8cc6-56690e66a952" path="/var/lib/kubelet/pods/bb63c9f5-457d-4c61-8cc6-56690e66a952/volumes" Jan 11 17:41:39 crc kubenswrapper[4837]: I0111 17:41:39.444582 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:41:39 crc kubenswrapper[4837]: I0111 17:41:39.445249 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:41:39 crc kubenswrapper[4837]: I0111 17:41:39.445313 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:41:39 crc kubenswrapper[4837]: I0111 17:41:39.446150 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10f6c1fba8d2ded4e3d8d28a0fb8b27acf8c6a02810295dad13d2e54f622ba5d"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 17:41:39 crc kubenswrapper[4837]: I0111 17:41:39.446245 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://10f6c1fba8d2ded4e3d8d28a0fb8b27acf8c6a02810295dad13d2e54f622ba5d" gracePeriod=600 Jan 11 17:41:40 crc kubenswrapper[4837]: I0111 17:41:40.555645 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="10f6c1fba8d2ded4e3d8d28a0fb8b27acf8c6a02810295dad13d2e54f622ba5d" exitCode=0 Jan 11 17:41:40 crc kubenswrapper[4837]: I0111 17:41:40.555712 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"10f6c1fba8d2ded4e3d8d28a0fb8b27acf8c6a02810295dad13d2e54f622ba5d"} Jan 11 17:41:40 crc kubenswrapper[4837]: I0111 17:41:40.556262 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" 
event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"6987b5d3ba894eab85f4ec92caa53aa44c7a42f0a8df2d9713c40a8de9354658"} Jan 11 17:41:40 crc kubenswrapper[4837]: I0111 17:41:40.556284 4837 scope.go:117] "RemoveContainer" containerID="51a674f214ce0ccc55b2fa9005d4dce39df2f19cf7b9f9089590388fd9cdba68" Jan 11 17:42:04 crc kubenswrapper[4837]: I0111 17:42:04.562496 4837 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.144511 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ssj7v"] Jan 11 17:42:27 crc kubenswrapper[4837]: E0111 17:42:27.145241 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb63c9f5-457d-4c61-8cc6-56690e66a952" containerName="registry" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.145259 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb63c9f5-457d-4c61-8cc6-56690e66a952" containerName="registry" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.145374 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb63c9f5-457d-4c61-8cc6-56690e66a952" containerName="registry" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.145839 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.150381 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn"] Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.150884 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-2f444"] Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.150972 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.151852 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6znwq" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.151882 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.151991 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-2f444" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.151996 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.152197 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5t9nn" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.164141 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-rxqrl" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.171554 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ssj7v"] Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.200107 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn"] Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.203215 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-2f444"] Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.242133 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch2q8\" (UniqueName: \"kubernetes.io/projected/5bf7e751-4059-4025-b610-732ec84bda0d-kube-api-access-ch2q8\") pod 
\"cert-manager-webhook-687f57d79b-ssj7v\" (UID: \"5bf7e751-4059-4025-b610-732ec84bda0d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.242461 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkfcd\" (UniqueName: \"kubernetes.io/projected/77732f18-1dd2-475e-9d27-69cf1f66df7d-kube-api-access-bkfcd\") pod \"cert-manager-cainjector-cf98fcc89-j7rbn\" (UID: \"77732f18-1dd2-475e-9d27-69cf1f66df7d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.344380 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkfcd\" (UniqueName: \"kubernetes.io/projected/77732f18-1dd2-475e-9d27-69cf1f66df7d-kube-api-access-bkfcd\") pod \"cert-manager-cainjector-cf98fcc89-j7rbn\" (UID: \"77732f18-1dd2-475e-9d27-69cf1f66df7d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.344434 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch2q8\" (UniqueName: \"kubernetes.io/projected/5bf7e751-4059-4025-b610-732ec84bda0d-kube-api-access-ch2q8\") pod \"cert-manager-webhook-687f57d79b-ssj7v\" (UID: \"5bf7e751-4059-4025-b610-732ec84bda0d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.344478 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfmw7\" (UniqueName: \"kubernetes.io/projected/469b992e-fb84-479b-8ec6-5c6490e9daf5-kube-api-access-kfmw7\") pod \"cert-manager-858654f9db-2f444\" (UID: \"469b992e-fb84-479b-8ec6-5c6490e9daf5\") " pod="cert-manager/cert-manager-858654f9db-2f444" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.365052 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bkfcd\" (UniqueName: \"kubernetes.io/projected/77732f18-1dd2-475e-9d27-69cf1f66df7d-kube-api-access-bkfcd\") pod \"cert-manager-cainjector-cf98fcc89-j7rbn\" (UID: \"77732f18-1dd2-475e-9d27-69cf1f66df7d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.370809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch2q8\" (UniqueName: \"kubernetes.io/projected/5bf7e751-4059-4025-b610-732ec84bda0d-kube-api-access-ch2q8\") pod \"cert-manager-webhook-687f57d79b-ssj7v\" (UID: \"5bf7e751-4059-4025-b610-732ec84bda0d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.445949 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfmw7\" (UniqueName: \"kubernetes.io/projected/469b992e-fb84-479b-8ec6-5c6490e9daf5-kube-api-access-kfmw7\") pod \"cert-manager-858654f9db-2f444\" (UID: \"469b992e-fb84-479b-8ec6-5c6490e9daf5\") " pod="cert-manager/cert-manager-858654f9db-2f444" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.466245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfmw7\" (UniqueName: \"kubernetes.io/projected/469b992e-fb84-479b-8ec6-5c6490e9daf5-kube-api-access-kfmw7\") pod \"cert-manager-858654f9db-2f444\" (UID: \"469b992e-fb84-479b-8ec6-5c6490e9daf5\") " pod="cert-manager/cert-manager-858654f9db-2f444" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.500501 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.508535 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.515665 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-2f444" Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.700935 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn"] Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.707991 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.781571 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-2f444"] Jan 11 17:42:27 crc kubenswrapper[4837]: W0111 17:42:27.786794 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod469b992e_fb84_479b_8ec6_5c6490e9daf5.slice/crio-afb8182f763fe4a2d16a8424641477b436cb38ed5a7918ebf84020fa76357421 WatchSource:0}: Error finding container afb8182f763fe4a2d16a8424641477b436cb38ed5a7918ebf84020fa76357421: Status 404 returned error can't find the container with id afb8182f763fe4a2d16a8424641477b436cb38ed5a7918ebf84020fa76357421 Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.830625 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" event={"ID":"77732f18-1dd2-475e-9d27-69cf1f66df7d","Type":"ContainerStarted","Data":"c5849356991a3441c64f3c70238fc7da4cc8bfe3fd4add45d6b660ada2de5cec"} Jan 11 17:42:27 crc kubenswrapper[4837]: I0111 17:42:27.832223 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-2f444" event={"ID":"469b992e-fb84-479b-8ec6-5c6490e9daf5","Type":"ContainerStarted","Data":"afb8182f763fe4a2d16a8424641477b436cb38ed5a7918ebf84020fa76357421"} Jan 11 17:42:27 crc 
kubenswrapper[4837]: I0111 17:42:27.951979 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ssj7v"] Jan 11 17:42:27 crc kubenswrapper[4837]: W0111 17:42:27.956096 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bf7e751_4059_4025_b610_732ec84bda0d.slice/crio-983ab738b1fbcb9bd41a807788bb583843b130c38a041f9d0217633c32feeeed WatchSource:0}: Error finding container 983ab738b1fbcb9bd41a807788bb583843b130c38a041f9d0217633c32feeeed: Status 404 returned error can't find the container with id 983ab738b1fbcb9bd41a807788bb583843b130c38a041f9d0217633c32feeeed Jan 11 17:42:28 crc kubenswrapper[4837]: I0111 17:42:28.838809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" event={"ID":"5bf7e751-4059-4025-b610-732ec84bda0d","Type":"ContainerStarted","Data":"983ab738b1fbcb9bd41a807788bb583843b130c38a041f9d0217633c32feeeed"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.160566 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b7lgc"] Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.161552 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-controller" containerID="cri-o://7d938cb09523badf72bb434ce00ef1964af73cabdf0c3652e5dc3bab6d25a703" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.161626 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="nbdb" containerID="cri-o://878ba91885890209a4ad7ed46aaf8610fa1fef6202d8ffc3adf7f527c34d1d18" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.161744 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-node" containerID="cri-o://2f7c987cd6bac7b28de3c49c8249b4980de62aab3ea9a3b907df9fbeb0309301" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.161732 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7a19d365a2ce792e7e7bc6982d59fd8d21a540943aa7768d919ba498bd627c1e" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.161771 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-acl-logging" containerID="cri-o://b0d6c2ffdd65ea69d6d830257cf325f8e6b397c8b50a7a55742fed53fe655541" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.162236 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="sbdb" containerID="cri-o://6015de498ef0eae3d4666f7c48ba95f62f9de0c9a9a797e1b23edb1c48e5562d" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.161701 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="northd" containerID="cri-o://d6e485df1c1cc7ceb0e56aba342245806d7ec935e617dbb546bbc6717f65fed0" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.195517 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" 
podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovnkube-controller" containerID="cri-o://9292b0d8221063fb8d3780811566b26c880419dfe3f21d500094bf550f6249ff" gracePeriod=30 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.901503 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b7lgc_e1452749-ce38-41f8-89dd-4b567f2a3250/ovn-acl-logging/0.log" Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902397 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b7lgc_e1452749-ce38-41f8-89dd-4b567f2a3250/ovn-controller/0.log" Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902883 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="9292b0d8221063fb8d3780811566b26c880419dfe3f21d500094bf550f6249ff" exitCode=0 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902906 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="6015de498ef0eae3d4666f7c48ba95f62f9de0c9a9a797e1b23edb1c48e5562d" exitCode=0 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902914 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="878ba91885890209a4ad7ed46aaf8610fa1fef6202d8ffc3adf7f527c34d1d18" exitCode=0 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902923 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="d6e485df1c1cc7ceb0e56aba342245806d7ec935e617dbb546bbc6717f65fed0" exitCode=0 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902932 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="7a19d365a2ce792e7e7bc6982d59fd8d21a540943aa7768d919ba498bd627c1e" exitCode=0 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902940 4837 
generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="2f7c987cd6bac7b28de3c49c8249b4980de62aab3ea9a3b907df9fbeb0309301" exitCode=0 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902947 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="b0d6c2ffdd65ea69d6d830257cf325f8e6b397c8b50a7a55742fed53fe655541" exitCode=143 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902955 4837 generic.go:334] "Generic (PLEG): container finished" podID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerID="7d938cb09523badf72bb434ce00ef1964af73cabdf0c3652e5dc3bab6d25a703" exitCode=143 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.902947 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"9292b0d8221063fb8d3780811566b26c880419dfe3f21d500094bf550f6249ff"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.903063 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"6015de498ef0eae3d4666f7c48ba95f62f9de0c9a9a797e1b23edb1c48e5562d"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.903078 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"878ba91885890209a4ad7ed46aaf8610fa1fef6202d8ffc3adf7f527c34d1d18"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.903088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"d6e485df1c1cc7ceb0e56aba342245806d7ec935e617dbb546bbc6717f65fed0"} Jan 11 17:42:37 crc 
kubenswrapper[4837]: I0111 17:42:37.903097 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"7a19d365a2ce792e7e7bc6982d59fd8d21a540943aa7768d919ba498bd627c1e"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.903107 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"2f7c987cd6bac7b28de3c49c8249b4980de62aab3ea9a3b907df9fbeb0309301"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.903115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"b0d6c2ffdd65ea69d6d830257cf325f8e6b397c8b50a7a55742fed53fe655541"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.903123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"7d938cb09523badf72bb434ce00ef1964af73cabdf0c3652e5dc3bab6d25a703"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.905090 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v5bf5_78cc7c3f-09f5-4200-a647-8fa4e9b2aae5/kube-multus/0.log" Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.905161 4837 generic.go:334] "Generic (PLEG): container finished" podID="78cc7c3f-09f5-4200-a647-8fa4e9b2aae5" containerID="66c1f7c32c3134fb7c0af16ffb4d62add39d4be549cf3bc7a575e94dd9352cc2" exitCode=2 Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.905209 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v5bf5" 
event={"ID":"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5","Type":"ContainerDied","Data":"66c1f7c32c3134fb7c0af16ffb4d62add39d4be549cf3bc7a575e94dd9352cc2"} Jan 11 17:42:37 crc kubenswrapper[4837]: I0111 17:42:37.905964 4837 scope.go:117] "RemoveContainer" containerID="66c1f7c32c3134fb7c0af16ffb4d62add39d4be549cf3bc7a575e94dd9352cc2" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.170730 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b7lgc_e1452749-ce38-41f8-89dd-4b567f2a3250/ovn-acl-logging/0.log" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.171828 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b7lgc_e1452749-ce38-41f8-89dd-4b567f2a3250/ovn-controller/0.log" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.172439 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.201895 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-systemd\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.201964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f6rv\" (UniqueName: \"kubernetes.io/projected/e1452749-ce38-41f8-89dd-4b567f2a3250-kube-api-access-6f6rv\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.201997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-var-lib-openvswitch\") pod 
\"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202031 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-log-socket\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202063 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-ovn-kubernetes\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-config\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202164 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-netd\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202197 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-etc-openvswitch\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202155 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-log-socket" (OuterVolumeSpecName: "log-socket") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202175 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202200 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202240 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-systemd-units\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202312 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-netns\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202285 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202343 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-bin\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202366 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-node-log\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202364 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202392 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-ovn\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202416 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-node-log" (OuterVolumeSpecName: "node-log") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202457 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1452749-ce38-41f8-89dd-4b567f2a3250-ovn-node-metrics-cert\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202430 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-openvswitch\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202524 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-slash\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-kubelet\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202589 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-env-overrides\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202597 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202616 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-slash" (OuterVolumeSpecName: "host-slash") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202642 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-script-lib\") pod \"e1452749-ce38-41f8-89dd-4b567f2a3250\" (UID: \"e1452749-ce38-41f8-89dd-4b567f2a3250\") " Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.202741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203103 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203132 4837 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203156 4837 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-log-socket\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203173 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203189 4837 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203204 4837 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203186 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203219 4837 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203293 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203315 4837 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203334 4837 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-node-log\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203353 4837 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203371 4837 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203388 4837 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-slash\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203405 4837 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.203440 4837 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.208582 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1452749-ce38-41f8-89dd-4b567f2a3250-kube-api-access-6f6rv" (OuterVolumeSpecName: "kube-api-access-6f6rv") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "kube-api-access-6f6rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.208835 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1452749-ce38-41f8-89dd-4b567f2a3250-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.223827 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e1452749-ce38-41f8-89dd-4b567f2a3250" (UID: "e1452749-ce38-41f8-89dd-4b567f2a3250"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234204 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x2mlj"] Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234404 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="sbdb" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234416 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="sbdb" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234426 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovnkube-controller" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234433 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovnkube-controller" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234444 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-controller" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234450 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-controller" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234458 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="northd" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234463 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="northd" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234472 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-acl-logging" Jan 11 17:42:38 crc 
kubenswrapper[4837]: I0111 17:42:38.234477 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-acl-logging" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234485 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-node" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234491 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-node" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234502 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="nbdb" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234507 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="nbdb" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234514 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kubecfg-setup" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234520 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kubecfg-setup" Jan 11 17:42:38 crc kubenswrapper[4837]: E0111 17:42:38.234528 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-ovn-metrics" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234534 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-ovn-metrics" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234620 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-acl-logging" Jan 11 17:42:38 crc 
kubenswrapper[4837]: I0111 17:42:38.234630 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="nbdb" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234638 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovnkube-controller" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234645 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-node" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234651 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="ovn-controller" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234658 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="kube-rbac-proxy-ovn-metrics" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234666 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="sbdb" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.234698 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" containerName="northd" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.236414 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304396 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304437 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-ovnkube-script-lib\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-etc-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-cni-netd\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304511 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-cni-bin\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304526 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-slash\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304541 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-var-lib-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304556 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-ovn\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304575 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-run-netns\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304588 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-node-log\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304608 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-ovnkube-config\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-env-overrides\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304646 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8d01daa-e96c-4437-8139-fce0acd42889-ovn-node-metrics-cert\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304663 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304714 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nxkj2\" (UniqueName: \"kubernetes.io/projected/a8d01daa-e96c-4437-8139-fce0acd42889-kube-api-access-nxkj2\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304733 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-kubelet\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-systemd-units\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304767 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-systemd\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304785 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304802 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-log-socket\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304830 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304840 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1452749-ce38-41f8-89dd-4b567f2a3250-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304850 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304858 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1452749-ce38-41f8-89dd-4b567f2a3250-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304867 4837 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304875 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f6rv\" (UniqueName: \"kubernetes.io/projected/e1452749-ce38-41f8-89dd-4b567f2a3250-kube-api-access-6f6rv\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.304884 
4837 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1452749-ce38-41f8-89dd-4b567f2a3250-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.405853 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-ovnkube-config\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.405927 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-env-overrides\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.405983 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8d01daa-e96c-4437-8139-fce0acd42889-ovn-node-metrics-cert\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406019 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406063 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxkj2\" (UniqueName: 
\"kubernetes.io/projected/a8d01daa-e96c-4437-8139-fce0acd42889-kube-api-access-nxkj2\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406104 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-kubelet\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406143 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-systemd-units\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-systemd\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406208 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-kubelet\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406273 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406282 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-log-socket\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-systemd-units\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406287 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-systemd\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406360 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-ovnkube-script-lib\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406363 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-log-socket\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406403 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406394 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-etc-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406486 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-env-overrides\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-cni-netd\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406434 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-etc-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406533 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-cni-netd\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406572 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-cni-bin\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406627 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-slash\") pod \"ovnkube-node-x2mlj\" 
(UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406705 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-cni-bin\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-slash\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406707 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-var-lib-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406761 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-var-lib-openvswitch\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" 
Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-ovn\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406853 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-run-netns\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-node-log\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-run-ovn\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-host-run-netns\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.406991 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/a8d01daa-e96c-4437-8139-fce0acd42889-node-log\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.407229 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-ovnkube-config\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.407572 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8d01daa-e96c-4437-8139-fce0acd42889-ovnkube-script-lib\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.411771 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8d01daa-e96c-4437-8139-fce0acd42889-ovn-node-metrics-cert\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.427196 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxkj2\" (UniqueName: \"kubernetes.io/projected/a8d01daa-e96c-4437-8139-fce0acd42889-kube-api-access-nxkj2\") pod \"ovnkube-node-x2mlj\" (UID: \"a8d01daa-e96c-4437-8139-fce0acd42889\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.551784 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.912625 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"ee66c975624c9d306ae7781cfafa40270854c1b014a16deeeeabfce190573a45"} Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.922151 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b7lgc_e1452749-ce38-41f8-89dd-4b567f2a3250/ovn-acl-logging/0.log" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.923538 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b7lgc_e1452749-ce38-41f8-89dd-4b567f2a3250/ovn-controller/0.log" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.925025 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" event={"ID":"e1452749-ce38-41f8-89dd-4b567f2a3250","Type":"ContainerDied","Data":"6736d3517f7eec45b1843f7580da009352a6d13f6cc3bf67f0047b8b967eac72"} Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.925172 4837 scope.go:117] "RemoveContainer" containerID="9292b0d8221063fb8d3780811566b26c880419dfe3f21d500094bf550f6249ff" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.925073 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b7lgc" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.933697 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v5bf5_78cc7c3f-09f5-4200-a647-8fa4e9b2aae5/kube-multus/0.log" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.933762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v5bf5" event={"ID":"78cc7c3f-09f5-4200-a647-8fa4e9b2aae5","Type":"ContainerStarted","Data":"f1ba8a9e7e9a82e493d447f8bdcc23738bddb14b0f76e14ab38ac336797b70e1"} Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.953660 4837 scope.go:117] "RemoveContainer" containerID="6015de498ef0eae3d4666f7c48ba95f62f9de0c9a9a797e1b23edb1c48e5562d" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.973590 4837 scope.go:117] "RemoveContainer" containerID="878ba91885890209a4ad7ed46aaf8610fa1fef6202d8ffc3adf7f527c34d1d18" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.988257 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b7lgc"] Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.993075 4837 scope.go:117] "RemoveContainer" containerID="d6e485df1c1cc7ceb0e56aba342245806d7ec935e617dbb546bbc6717f65fed0" Jan 11 17:42:38 crc kubenswrapper[4837]: I0111 17:42:38.994406 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b7lgc"] Jan 11 17:42:39 crc kubenswrapper[4837]: I0111 17:42:39.006978 4837 scope.go:117] "RemoveContainer" containerID="7a19d365a2ce792e7e7bc6982d59fd8d21a540943aa7768d919ba498bd627c1e" Jan 11 17:42:39 crc kubenswrapper[4837]: I0111 17:42:39.019109 4837 scope.go:117] "RemoveContainer" containerID="2f7c987cd6bac7b28de3c49c8249b4980de62aab3ea9a3b907df9fbeb0309301" Jan 11 17:42:39 crc kubenswrapper[4837]: I0111 17:42:39.034595 4837 scope.go:117] "RemoveContainer" 
containerID="b0d6c2ffdd65ea69d6d830257cf325f8e6b397c8b50a7a55742fed53fe655541" Jan 11 17:42:39 crc kubenswrapper[4837]: I0111 17:42:39.049284 4837 scope.go:117] "RemoveContainer" containerID="7d938cb09523badf72bb434ce00ef1964af73cabdf0c3652e5dc3bab6d25a703" Jan 11 17:42:39 crc kubenswrapper[4837]: I0111 17:42:39.065758 4837 scope.go:117] "RemoveContainer" containerID="c84b06f66f7c241b37748895091d23bd71f1ca684c1fe1ef6116b709873ea779" Jan 11 17:42:39 crc kubenswrapper[4837]: I0111 17:42:39.940415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"b8d0bd2c4848292c2c165e83b70ce74ed9d6bd3df9fb466deb8184043b125bfb"} Jan 11 17:42:40 crc kubenswrapper[4837]: I0111 17:42:40.375870 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1452749-ce38-41f8-89dd-4b567f2a3250" path="/var/lib/kubelet/pods/e1452749-ce38-41f8-89dd-4b567f2a3250/volumes" Jan 11 17:42:40 crc kubenswrapper[4837]: I0111 17:42:40.955137 4837 generic.go:334] "Generic (PLEG): container finished" podID="a8d01daa-e96c-4437-8139-fce0acd42889" containerID="b8d0bd2c4848292c2c165e83b70ce74ed9d6bd3df9fb466deb8184043b125bfb" exitCode=0 Jan 11 17:42:40 crc kubenswrapper[4837]: I0111 17:42:40.955172 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerDied","Data":"b8d0bd2c4848292c2c165e83b70ce74ed9d6bd3df9fb466deb8184043b125bfb"} Jan 11 17:42:49 crc kubenswrapper[4837]: I0111 17:42:49.012454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"77bccc08647327ffea67bb01ae781b8c1f93a752eb872bda126ab0bbddccb7b1"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.030899 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"5e8483dc1012e08f47a6bd6c56164e859e8948e156a9afb01a6125671a1d52eb"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.031600 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"07687c12d0e66c5a891ec2e19f81000945665235fe1d5a15f5487eb3d2f1c124"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.031620 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"50a815315bbba41f477cc35b026d9a7188ceb2a0912886a15d230daebfd0fbc3"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.031636 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"e7791116f78d95416622f4dc5b3e10f36747f36356a13a66696340a3f890b315"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.031651 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"7abbef1bf04cd54ced1d484bd6a3661f6333fdd9db3e21d8ea981bdf0bbe9abc"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.032205 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" event={"ID":"5bf7e751-4059-4025-b610-732ec84bda0d","Type":"ContainerStarted","Data":"41f7ceeb95528d88da0ff65ea38a2eeb73f69b6c650b47025d0ba076271e2ec6"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.033090 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.034594 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-2f444" event={"ID":"469b992e-fb84-479b-8ec6-5c6490e9daf5","Type":"ContainerStarted","Data":"e481a3e7aecadb9c3b3261df56abddc56e52076730f5ff288ce364f577032b46"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.036802 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" event={"ID":"77732f18-1dd2-475e-9d27-69cf1f66df7d","Type":"ContainerStarted","Data":"362793f9796330c9594fab1fdda3cd0f077558c98ee957523f42ba9dc7a0c3e2"} Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.060464 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" podStartSLOduration=2.098974968 podStartE2EDuration="24.0604407s" podCreationTimestamp="2026-01-11 17:42:27 +0000 UTC" firstStartedPulling="2026-01-11 17:42:27.958282141 +0000 UTC m=+722.136474857" lastFinishedPulling="2026-01-11 17:42:49.919747883 +0000 UTC m=+744.097940589" observedRunningTime="2026-01-11 17:42:51.059786053 +0000 UTC m=+745.237978799" watchObservedRunningTime="2026-01-11 17:42:51.0604407 +0000 UTC m=+745.238633396" Jan 11 17:42:51 crc kubenswrapper[4837]: I0111 17:42:51.076898 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-j7rbn" podStartSLOduration=1.855957129 podStartE2EDuration="24.076879538s" podCreationTimestamp="2026-01-11 17:42:27 +0000 UTC" firstStartedPulling="2026-01-11 17:42:27.707777542 +0000 UTC m=+721.885970248" lastFinishedPulling="2026-01-11 17:42:49.928699951 +0000 UTC m=+744.106892657" observedRunningTime="2026-01-11 17:42:51.072532742 +0000 UTC m=+745.250725448" watchObservedRunningTime="2026-01-11 17:42:51.076879538 +0000 UTC m=+745.255072244" Jan 11 17:42:51 crc 
kubenswrapper[4837]: I0111 17:42:51.088468 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-2f444" podStartSLOduration=1.889361659 podStartE2EDuration="24.088443747s" podCreationTimestamp="2026-01-11 17:42:27 +0000 UTC" firstStartedPulling="2026-01-11 17:42:27.790332393 +0000 UTC m=+721.968525099" lastFinishedPulling="2026-01-11 17:42:49.989414471 +0000 UTC m=+744.167607187" observedRunningTime="2026-01-11 17:42:51.087084571 +0000 UTC m=+745.265277277" watchObservedRunningTime="2026-01-11 17:42:51.088443747 +0000 UTC m=+745.266636453" Jan 11 17:42:53 crc kubenswrapper[4837]: I0111 17:42:53.055496 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"92364dba47b1584493ac2ee43ee49385e1b2270327dc70e16f4182dcf8a533be"} Jan 11 17:42:56 crc kubenswrapper[4837]: I0111 17:42:56.077406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" event={"ID":"a8d01daa-e96c-4437-8139-fce0acd42889","Type":"ContainerStarted","Data":"da8170a8c74dc429abac052f225d80314ff12b0bd1696f95d09d22166f0e5aae"} Jan 11 17:42:56 crc kubenswrapper[4837]: I0111 17:42:56.077779 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:56 crc kubenswrapper[4837]: I0111 17:42:56.077797 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:56 crc kubenswrapper[4837]: I0111 17:42:56.104905 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" podStartSLOduration=18.104887703 podStartE2EDuration="18.104887703s" podCreationTimestamp="2026-01-11 17:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:42:56.098918894 +0000 UTC m=+750.277111620" watchObservedRunningTime="2026-01-11 17:42:56.104887703 +0000 UTC m=+750.283080419" Jan 11 17:42:56 crc kubenswrapper[4837]: I0111 17:42:56.106218 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:57 crc kubenswrapper[4837]: I0111 17:42:57.083390 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:57 crc kubenswrapper[4837]: I0111 17:42:57.123884 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:42:57 crc kubenswrapper[4837]: I0111 17:42:57.504427 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-ssj7v" Jan 11 17:43:08 crc kubenswrapper[4837]: I0111 17:43:08.587934 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2mlj" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.267148 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5"] Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.269935 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.273280 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.278278 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5"] Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.290104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m46tt\" (UniqueName: \"kubernetes.io/projected/a50fb0c0-dff3-4722-b0ed-4c014c80faee-kube-api-access-m46tt\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.290248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.290329 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: 
I0111 17:43:39.391918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m46tt\" (UniqueName: \"kubernetes.io/projected/a50fb0c0-dff3-4722-b0ed-4c014c80faee-kube-api-access-m46tt\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.392085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.392149 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.394153 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.394284 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.417406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m46tt\" (UniqueName: \"kubernetes.io/projected/a50fb0c0-dff3-4722-b0ed-4c014c80faee-kube-api-access-m46tt\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.443962 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.444025 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.596846 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:39 crc kubenswrapper[4837]: I0111 17:43:39.834698 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5"] Jan 11 17:43:40 crc kubenswrapper[4837]: I0111 17:43:40.351987 4837 generic.go:334] "Generic (PLEG): container finished" podID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerID="b5dd33d47c067b3cd480c0b04fed6352997056ef8a36217101c80bb9dad1f717" exitCode=0 Jan 11 17:43:40 crc kubenswrapper[4837]: I0111 17:43:40.352076 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" event={"ID":"a50fb0c0-dff3-4722-b0ed-4c014c80faee","Type":"ContainerDied","Data":"b5dd33d47c067b3cd480c0b04fed6352997056ef8a36217101c80bb9dad1f717"} Jan 11 17:43:40 crc kubenswrapper[4837]: I0111 17:43:40.352247 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" event={"ID":"a50fb0c0-dff3-4722-b0ed-4c014c80faee","Type":"ContainerStarted","Data":"c87fbe1989744e6a137639a54d9663af6722542d6054b2059cba96c2a29c8748"} Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.581417 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n88lw"] Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.588189 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.598469 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n88lw"] Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.645418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7npp\" (UniqueName: \"kubernetes.io/projected/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-kube-api-access-p7npp\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.645491 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-utilities\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.645512 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-catalog-content\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.746446 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7npp\" (UniqueName: \"kubernetes.io/projected/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-kube-api-access-p7npp\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.746498 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-utilities\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.746765 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-catalog-content\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.746917 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-utilities\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.747158 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-catalog-content\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.765113 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7npp\" (UniqueName: \"kubernetes.io/projected/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-kube-api-access-p7npp\") pod \"redhat-operators-n88lw\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:41 crc kubenswrapper[4837]: I0111 17:43:41.957113 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:42 crc kubenswrapper[4837]: W0111 17:43:42.149143 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8630d06_af1d_4e1d_95c3_5fb89c8294ff.slice/crio-f09a9de4717351a6b0006afaea39492110980f5af9c6dbc611504e246c8dc72a WatchSource:0}: Error finding container f09a9de4717351a6b0006afaea39492110980f5af9c6dbc611504e246c8dc72a: Status 404 returned error can't find the container with id f09a9de4717351a6b0006afaea39492110980f5af9c6dbc611504e246c8dc72a Jan 11 17:43:42 crc kubenswrapper[4837]: I0111 17:43:42.152949 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n88lw"] Jan 11 17:43:42 crc kubenswrapper[4837]: I0111 17:43:42.363743 4837 generic.go:334] "Generic (PLEG): container finished" podID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerID="179be47d8d352ddd2025837a6608854e2a4c0df6d2721be38559cf440aed63f9" exitCode=0 Jan 11 17:43:42 crc kubenswrapper[4837]: I0111 17:43:42.367305 4837 generic.go:334] "Generic (PLEG): container finished" podID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerID="4002218debcbc93414f22773834a86e49dfe71dec33789c04e4d60edf7cc01b5" exitCode=0 Jan 11 17:43:42 crc kubenswrapper[4837]: I0111 17:43:42.374306 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n88lw" event={"ID":"a8630d06-af1d-4e1d-95c3-5fb89c8294ff","Type":"ContainerDied","Data":"179be47d8d352ddd2025837a6608854e2a4c0df6d2721be38559cf440aed63f9"} Jan 11 17:43:42 crc kubenswrapper[4837]: I0111 17:43:42.374367 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n88lw" event={"ID":"a8630d06-af1d-4e1d-95c3-5fb89c8294ff","Type":"ContainerStarted","Data":"f09a9de4717351a6b0006afaea39492110980f5af9c6dbc611504e246c8dc72a"} Jan 11 17:43:42 crc kubenswrapper[4837]: I0111 17:43:42.374385 
4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" event={"ID":"a50fb0c0-dff3-4722-b0ed-4c014c80faee","Type":"ContainerDied","Data":"4002218debcbc93414f22773834a86e49dfe71dec33789c04e4d60edf7cc01b5"} Jan 11 17:43:43 crc kubenswrapper[4837]: I0111 17:43:43.374142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n88lw" event={"ID":"a8630d06-af1d-4e1d-95c3-5fb89c8294ff","Type":"ContainerStarted","Data":"e55e017c617f3b120ee54e06589f9be7b3e094a54037ea86c4e1724492d6910b"} Jan 11 17:43:43 crc kubenswrapper[4837]: I0111 17:43:43.377234 4837 generic.go:334] "Generic (PLEG): container finished" podID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerID="7fbc4db5c998af8dbcb92129155c5c76795bd4a0427196bb02138210c4f8b2be" exitCode=0 Jan 11 17:43:43 crc kubenswrapper[4837]: I0111 17:43:43.377267 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" event={"ID":"a50fb0c0-dff3-4722-b0ed-4c014c80faee","Type":"ContainerDied","Data":"7fbc4db5c998af8dbcb92129155c5c76795bd4a0427196bb02138210c4f8b2be"} Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.386568 4837 generic.go:334] "Generic (PLEG): container finished" podID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerID="e55e017c617f3b120ee54e06589f9be7b3e094a54037ea86c4e1724492d6910b" exitCode=0 Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.386723 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n88lw" event={"ID":"a8630d06-af1d-4e1d-95c3-5fb89c8294ff","Type":"ContainerDied","Data":"e55e017c617f3b120ee54e06589f9be7b3e094a54037ea86c4e1724492d6910b"} Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.668889 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.788431 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-bundle\") pod \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.788538 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m46tt\" (UniqueName: \"kubernetes.io/projected/a50fb0c0-dff3-4722-b0ed-4c014c80faee-kube-api-access-m46tt\") pod \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.788617 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-util\") pod \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\" (UID: \"a50fb0c0-dff3-4722-b0ed-4c014c80faee\") " Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.789586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-bundle" (OuterVolumeSpecName: "bundle") pod "a50fb0c0-dff3-4722-b0ed-4c014c80faee" (UID: "a50fb0c0-dff3-4722-b0ed-4c014c80faee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.797171 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a50fb0c0-dff3-4722-b0ed-4c014c80faee-kube-api-access-m46tt" (OuterVolumeSpecName: "kube-api-access-m46tt") pod "a50fb0c0-dff3-4722-b0ed-4c014c80faee" (UID: "a50fb0c0-dff3-4722-b0ed-4c014c80faee"). InnerVolumeSpecName "kube-api-access-m46tt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.817427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-util" (OuterVolumeSpecName: "util") pod "a50fb0c0-dff3-4722-b0ed-4c014c80faee" (UID: "a50fb0c0-dff3-4722-b0ed-4c014c80faee"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.890081 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-util\") on node \"crc\" DevicePath \"\"" Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.890146 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a50fb0c0-dff3-4722-b0ed-4c014c80faee-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:43:44 crc kubenswrapper[4837]: I0111 17:43:44.890169 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m46tt\" (UniqueName: \"kubernetes.io/projected/a50fb0c0-dff3-4722-b0ed-4c014c80faee-kube-api-access-m46tt\") on node \"crc\" DevicePath \"\"" Jan 11 17:43:45 crc kubenswrapper[4837]: I0111 17:43:45.396866 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" Jan 11 17:43:45 crc kubenswrapper[4837]: I0111 17:43:45.396896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5" event={"ID":"a50fb0c0-dff3-4722-b0ed-4c014c80faee","Type":"ContainerDied","Data":"c87fbe1989744e6a137639a54d9663af6722542d6054b2059cba96c2a29c8748"} Jan 11 17:43:45 crc kubenswrapper[4837]: I0111 17:43:45.397497 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87fbe1989744e6a137639a54d9663af6722542d6054b2059cba96c2a29c8748" Jan 11 17:43:45 crc kubenswrapper[4837]: I0111 17:43:45.399667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n88lw" event={"ID":"a8630d06-af1d-4e1d-95c3-5fb89c8294ff","Type":"ContainerStarted","Data":"6d539a8e7bb34ff598feacfad898da6b42bff1e8294f80e55dcc34c82c9ab415"} Jan 11 17:43:45 crc kubenswrapper[4837]: I0111 17:43:45.714470 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n88lw" podStartSLOduration=2.23613591 podStartE2EDuration="4.714445956s" podCreationTimestamp="2026-01-11 17:43:41 +0000 UTC" firstStartedPulling="2026-01-11 17:43:42.36575521 +0000 UTC m=+796.543947916" lastFinishedPulling="2026-01-11 17:43:44.844065256 +0000 UTC m=+799.022257962" observedRunningTime="2026-01-11 17:43:45.42437872 +0000 UTC m=+799.602571466" watchObservedRunningTime="2026-01-11 17:43:45.714445956 +0000 UTC m=+799.892638682" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.305272 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-tx7vk"] Jan 11 17:43:49 crc kubenswrapper[4837]: E0111 17:43:49.305860 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" 
containerName="util" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.305882 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerName="util" Jan 11 17:43:49 crc kubenswrapper[4837]: E0111 17:43:49.305904 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerName="extract" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.305913 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerName="extract" Jan 11 17:43:49 crc kubenswrapper[4837]: E0111 17:43:49.305923 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerName="pull" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.305932 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerName="pull" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.306056 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a50fb0c0-dff3-4722-b0ed-4c014c80faee" containerName="extract" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.306508 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.308713 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2mzfw" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.309900 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.315042 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.318119 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-tx7vk"] Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.350025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pfp\" (UniqueName: \"kubernetes.io/projected/7340b2fb-4088-4358-977e-020434c7fa2c-kube-api-access-f7pfp\") pod \"nmstate-operator-6769fb99d-tx7vk\" (UID: \"7340b2fb-4088-4358-977e-020434c7fa2c\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.451483 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pfp\" (UniqueName: \"kubernetes.io/projected/7340b2fb-4088-4358-977e-020434c7fa2c-kube-api-access-f7pfp\") pod \"nmstate-operator-6769fb99d-tx7vk\" (UID: \"7340b2fb-4088-4358-977e-020434c7fa2c\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.473207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pfp\" (UniqueName: \"kubernetes.io/projected/7340b2fb-4088-4358-977e-020434c7fa2c-kube-api-access-f7pfp\") pod \"nmstate-operator-6769fb99d-tx7vk\" (UID: 
\"7340b2fb-4088-4358-977e-020434c7fa2c\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.622946 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" Jan 11 17:43:49 crc kubenswrapper[4837]: I0111 17:43:49.872552 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-tx7vk"] Jan 11 17:43:49 crc kubenswrapper[4837]: W0111 17:43:49.878548 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7340b2fb_4088_4358_977e_020434c7fa2c.slice/crio-0c24a257787e0ad9125f09e45e44735d06a747e817ce6f94644a0b0690c6eaed WatchSource:0}: Error finding container 0c24a257787e0ad9125f09e45e44735d06a747e817ce6f94644a0b0690c6eaed: Status 404 returned error can't find the container with id 0c24a257787e0ad9125f09e45e44735d06a747e817ce6f94644a0b0690c6eaed Jan 11 17:43:50 crc kubenswrapper[4837]: I0111 17:43:50.433322 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" event={"ID":"7340b2fb-4088-4358-977e-020434c7fa2c","Type":"ContainerStarted","Data":"0c24a257787e0ad9125f09e45e44735d06a747e817ce6f94644a0b0690c6eaed"} Jan 11 17:43:51 crc kubenswrapper[4837]: I0111 17:43:51.958082 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:51 crc kubenswrapper[4837]: I0111 17:43:51.958366 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:52 crc kubenswrapper[4837]: I0111 17:43:52.005532 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:52 crc kubenswrapper[4837]: I0111 17:43:52.506395 4837 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:53 crc kubenswrapper[4837]: I0111 17:43:53.452791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" event={"ID":"7340b2fb-4088-4358-977e-020434c7fa2c","Type":"ContainerStarted","Data":"88e53c0bb311402aa514d31a6ca6001e18915d5caa89968a2ece06b5384fbbb5"} Jan 11 17:43:53 crc kubenswrapper[4837]: I0111 17:43:53.469879 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-tx7vk" podStartSLOduration=1.873177306 podStartE2EDuration="4.469859757s" podCreationTimestamp="2026-01-11 17:43:49 +0000 UTC" firstStartedPulling="2026-01-11 17:43:49.886937197 +0000 UTC m=+804.065129913" lastFinishedPulling="2026-01-11 17:43:52.483619618 +0000 UTC m=+806.661812364" observedRunningTime="2026-01-11 17:43:53.466822616 +0000 UTC m=+807.645015332" watchObservedRunningTime="2026-01-11 17:43:53.469859757 +0000 UTC m=+807.648052483" Jan 11 17:43:54 crc kubenswrapper[4837]: I0111 17:43:54.374586 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n88lw"] Jan 11 17:43:54 crc kubenswrapper[4837]: I0111 17:43:54.467559 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n88lw" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="registry-server" containerID="cri-o://6d539a8e7bb34ff598feacfad898da6b42bff1e8294f80e55dcc34c82c9ab415" gracePeriod=2 Jan 11 17:43:56 crc kubenswrapper[4837]: I0111 17:43:56.481643 4837 generic.go:334] "Generic (PLEG): container finished" podID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerID="6d539a8e7bb34ff598feacfad898da6b42bff1e8294f80e55dcc34c82c9ab415" exitCode=0 Jan 11 17:43:56 crc kubenswrapper[4837]: I0111 17:43:56.481723 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-n88lw" event={"ID":"a8630d06-af1d-4e1d-95c3-5fb89c8294ff","Type":"ContainerDied","Data":"6d539a8e7bb34ff598feacfad898da6b42bff1e8294f80e55dcc34c82c9ab415"} Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.564710 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.579779 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-catalog-content\") pod \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.579831 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-utilities\") pod \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.579897 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7npp\" (UniqueName: \"kubernetes.io/projected/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-kube-api-access-p7npp\") pod \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\" (UID: \"a8630d06-af1d-4e1d-95c3-5fb89c8294ff\") " Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.581447 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-utilities" (OuterVolumeSpecName: "utilities") pod "a8630d06-af1d-4e1d-95c3-5fb89c8294ff" (UID: "a8630d06-af1d-4e1d-95c3-5fb89c8294ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.581751 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.589275 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-kube-api-access-p7npp" (OuterVolumeSpecName: "kube-api-access-p7npp") pod "a8630d06-af1d-4e1d-95c3-5fb89c8294ff" (UID: "a8630d06-af1d-4e1d-95c3-5fb89c8294ff"). InnerVolumeSpecName "kube-api-access-p7npp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.682645 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7npp\" (UniqueName: \"kubernetes.io/projected/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-kube-api-access-p7npp\") on node \"crc\" DevicePath \"\"" Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.722773 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8630d06-af1d-4e1d-95c3-5fb89c8294ff" (UID: "a8630d06-af1d-4e1d-95c3-5fb89c8294ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:43:57 crc kubenswrapper[4837]: I0111 17:43:57.785496 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8630d06-af1d-4e1d-95c3-5fb89c8294ff-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:43:58 crc kubenswrapper[4837]: I0111 17:43:58.498560 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n88lw" event={"ID":"a8630d06-af1d-4e1d-95c3-5fb89c8294ff","Type":"ContainerDied","Data":"f09a9de4717351a6b0006afaea39492110980f5af9c6dbc611504e246c8dc72a"} Jan 11 17:43:58 crc kubenswrapper[4837]: I0111 17:43:58.498644 4837 scope.go:117] "RemoveContainer" containerID="6d539a8e7bb34ff598feacfad898da6b42bff1e8294f80e55dcc34c82c9ab415" Jan 11 17:43:58 crc kubenswrapper[4837]: I0111 17:43:58.498708 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n88lw" Jan 11 17:43:58 crc kubenswrapper[4837]: I0111 17:43:58.526556 4837 scope.go:117] "RemoveContainer" containerID="e55e017c617f3b120ee54e06589f9be7b3e094a54037ea86c4e1724492d6910b" Jan 11 17:43:58 crc kubenswrapper[4837]: I0111 17:43:58.543285 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n88lw"] Jan 11 17:43:58 crc kubenswrapper[4837]: I0111 17:43:58.551596 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n88lw"] Jan 11 17:43:58 crc kubenswrapper[4837]: I0111 17:43:58.560515 4837 scope.go:117] "RemoveContainer" containerID="179be47d8d352ddd2025837a6608854e2a4c0df6d2721be38559cf440aed63f9" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.458775 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s"] Jan 11 17:43:59 crc kubenswrapper[4837]: E0111 17:43:59.459082 4837 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="extract-utilities" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.459101 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="extract-utilities" Jan 11 17:43:59 crc kubenswrapper[4837]: E0111 17:43:59.459127 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="extract-content" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.459139 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="extract-content" Jan 11 17:43:59 crc kubenswrapper[4837]: E0111 17:43:59.459155 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="registry-server" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.459168 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="registry-server" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.459322 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" containerName="registry-server" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.460240 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.462605 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pkwjg" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.465187 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4tph5"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.465894 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.468464 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.472210 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.488920 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4tph5"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.497657 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-68ws8"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.498577 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.512440 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-nmstate-lock\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.512492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdb5m\" (UniqueName: \"kubernetes.io/projected/c440fad0-c0e0-4553-ad26-b843f81c8863-kube-api-access-bdb5m\") pod \"nmstate-metrics-7f7f7578db-znz8s\" (UID: \"c440fad0-c0e0-4553-ad26-b843f81c8863\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.512554 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjvk\" (UniqueName: 
\"kubernetes.io/projected/1c2557e3-14a8-4911-92b0-564bb7b60b06-kube-api-access-tdjvk\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.512582 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkmpq\" (UniqueName: \"kubernetes.io/projected/5d9870f8-4c71-4490-8e77-17f1a82e725a-kube-api-access-mkmpq\") pod \"nmstate-webhook-f8fb84555-4tph5\" (UID: \"5d9870f8-4c71-4490-8e77-17f1a82e725a\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.512602 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5d9870f8-4c71-4490-8e77-17f1a82e725a-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4tph5\" (UID: \"5d9870f8-4c71-4490-8e77-17f1a82e725a\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.512618 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-dbus-socket\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.512638 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-ovs-socket\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.585033 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.585876 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.587160 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hlhx6" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.588271 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.589000 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.603169 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613231 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-ovs-socket\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613296 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3b85571d-dea1-437f-bd5c-27d5d421411e-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613319 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-nmstate-lock\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613342 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdb5m\" (UniqueName: \"kubernetes.io/projected/c440fad0-c0e0-4553-ad26-b843f81c8863-kube-api-access-bdb5m\") pod \"nmstate-metrics-7f7f7578db-znz8s\" (UID: \"c440fad0-c0e0-4553-ad26-b843f81c8863\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613368 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56vrs\" (UniqueName: \"kubernetes.io/projected/3b85571d-dea1-437f-bd5c-27d5d421411e-kube-api-access-56vrs\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjvk\" (UniqueName: \"kubernetes.io/projected/1c2557e3-14a8-4911-92b0-564bb7b60b06-kube-api-access-tdjvk\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613421 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b85571d-dea1-437f-bd5c-27d5d421411e-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613452 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkmpq\" (UniqueName: \"kubernetes.io/projected/5d9870f8-4c71-4490-8e77-17f1a82e725a-kube-api-access-mkmpq\") pod \"nmstate-webhook-f8fb84555-4tph5\" (UID: \"5d9870f8-4c71-4490-8e77-17f1a82e725a\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613475 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5d9870f8-4c71-4490-8e77-17f1a82e725a-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4tph5\" (UID: \"5d9870f8-4c71-4490-8e77-17f1a82e725a\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-dbus-socket\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613771 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-ovs-socket\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613821 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-dbus-socket\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.613880 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1c2557e3-14a8-4911-92b0-564bb7b60b06-nmstate-lock\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: E0111 17:43:59.613986 4837 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 11 17:43:59 crc kubenswrapper[4837]: E0111 17:43:59.614040 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d9870f8-4c71-4490-8e77-17f1a82e725a-tls-key-pair podName:5d9870f8-4c71-4490-8e77-17f1a82e725a nodeName:}" failed. No retries permitted until 2026-01-11 17:44:00.114022563 +0000 UTC m=+814.292215270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5d9870f8-4c71-4490-8e77-17f1a82e725a-tls-key-pair") pod "nmstate-webhook-f8fb84555-4tph5" (UID: "5d9870f8-4c71-4490-8e77-17f1a82e725a") : secret "openshift-nmstate-webhook" not found Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.632119 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdb5m\" (UniqueName: \"kubernetes.io/projected/c440fad0-c0e0-4553-ad26-b843f81c8863-kube-api-access-bdb5m\") pod \"nmstate-metrics-7f7f7578db-znz8s\" (UID: \"c440fad0-c0e0-4553-ad26-b843f81c8863\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.643375 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjvk\" (UniqueName: \"kubernetes.io/projected/1c2557e3-14a8-4911-92b0-564bb7b60b06-kube-api-access-tdjvk\") pod \"nmstate-handler-68ws8\" (UID: \"1c2557e3-14a8-4911-92b0-564bb7b60b06\") " pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.643451 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mkmpq\" (UniqueName: \"kubernetes.io/projected/5d9870f8-4c71-4490-8e77-17f1a82e725a-kube-api-access-mkmpq\") pod \"nmstate-webhook-f8fb84555-4tph5\" (UID: \"5d9870f8-4c71-4490-8e77-17f1a82e725a\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.714819 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3b85571d-dea1-437f-bd5c-27d5d421411e-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.714876 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56vrs\" (UniqueName: \"kubernetes.io/projected/3b85571d-dea1-437f-bd5c-27d5d421411e-kube-api-access-56vrs\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.715222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b85571d-dea1-437f-bd5c-27d5d421411e-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: E0111 17:43:59.715330 4837 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 11 17:43:59 crc kubenswrapper[4837]: E0111 17:43:59.715590 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b85571d-dea1-437f-bd5c-27d5d421411e-plugin-serving-cert podName:3b85571d-dea1-437f-bd5c-27d5d421411e 
nodeName:}" failed. No retries permitted until 2026-01-11 17:44:00.215558302 +0000 UTC m=+814.393751008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3b85571d-dea1-437f-bd5c-27d5d421411e-plugin-serving-cert") pod "nmstate-console-plugin-6ff7998486-kkz9f" (UID: "3b85571d-dea1-437f-bd5c-27d5d421411e") : secret "plugin-serving-cert" not found Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.715698 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3b85571d-dea1-437f-bd5c-27d5d421411e-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.747482 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56vrs\" (UniqueName: \"kubernetes.io/projected/3b85571d-dea1-437f-bd5c-27d5d421411e-kube-api-access-56vrs\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.775604 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.784846 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b67ffbb49-xj5cd"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.785654 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.811600 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b67ffbb49-xj5cd"] Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.818854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/703af3d8-b286-4fdb-ac04-ea2084575961-console-serving-cert\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.818912 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4md\" (UniqueName: \"kubernetes.io/projected/703af3d8-b286-4fdb-ac04-ea2084575961-kube-api-access-xc4md\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.818939 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-trusted-ca-bundle\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.818988 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-service-ca\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.819052 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-console-config\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.819073 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-oauth-serving-cert\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.819094 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/703af3d8-b286-4fdb-ac04-ea2084575961-console-oauth-config\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.819388 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.923583 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-service-ca\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.923661 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-console-config\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.923692 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-oauth-serving-cert\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.923731 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/703af3d8-b286-4fdb-ac04-ea2084575961-console-oauth-config\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.923769 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/703af3d8-b286-4fdb-ac04-ea2084575961-console-serving-cert\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " 
pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.923792 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4md\" (UniqueName: \"kubernetes.io/projected/703af3d8-b286-4fdb-ac04-ea2084575961-kube-api-access-xc4md\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.923813 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-trusted-ca-bundle\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.924929 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-service-ca\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.925144 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-trusted-ca-bundle\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.926307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-console-config\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 
17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.926312 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/703af3d8-b286-4fdb-ac04-ea2084575961-oauth-serving-cert\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.929719 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/703af3d8-b286-4fdb-ac04-ea2084575961-console-oauth-config\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.934429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/703af3d8-b286-4fdb-ac04-ea2084575961-console-serving-cert\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.950579 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4md\" (UniqueName: \"kubernetes.io/projected/703af3d8-b286-4fdb-ac04-ea2084575961-kube-api-access-xc4md\") pod \"console-5b67ffbb49-xj5cd\" (UID: \"703af3d8-b286-4fdb-ac04-ea2084575961\") " pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:43:59 crc kubenswrapper[4837]: I0111 17:43:59.976279 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s"] Jan 11 17:43:59 crc kubenswrapper[4837]: W0111 17:43:59.984703 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc440fad0_c0e0_4553_ad26_b843f81c8863.slice/crio-6d3895f8b1260bbd2e497ba543e16d0f35a41d4da1d336faae01babe113750c9 WatchSource:0}: Error finding container 6d3895f8b1260bbd2e497ba543e16d0f35a41d4da1d336faae01babe113750c9: Status 404 returned error can't find the container with id 6d3895f8b1260bbd2e497ba543e16d0f35a41d4da1d336faae01babe113750c9 Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.126432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5d9870f8-4c71-4490-8e77-17f1a82e725a-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4tph5\" (UID: \"5d9870f8-4c71-4490-8e77-17f1a82e725a\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.130657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5d9870f8-4c71-4490-8e77-17f1a82e725a-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4tph5\" (UID: \"5d9870f8-4c71-4490-8e77-17f1a82e725a\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.148705 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.228749 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b85571d-dea1-437f-bd5c-27d5d421411e-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.234198 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b85571d-dea1-437f-bd5c-27d5d421411e-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-kkz9f\" (UID: \"3b85571d-dea1-437f-bd5c-27d5d421411e\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.317981 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b67ffbb49-xj5cd"] Jan 11 17:44:00 crc kubenswrapper[4837]: W0111 17:44:00.326085 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703af3d8_b286_4fdb_ac04_ea2084575961.slice/crio-3aaa802e374f7b64e1bbb3c57b12393578366c40eb1a7153ec16655638c771b2 WatchSource:0}: Error finding container 3aaa802e374f7b64e1bbb3c57b12393578366c40eb1a7153ec16655638c771b2: Status 404 returned error can't find the container with id 3aaa802e374f7b64e1bbb3c57b12393578366c40eb1a7153ec16655638c771b2 Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.376384 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8630d06-af1d-4e1d-95c3-5fb89c8294ff" path="/var/lib/kubelet/pods/a8630d06-af1d-4e1d-95c3-5fb89c8294ff/volumes" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.386159 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.505183 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.513086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" event={"ID":"c440fad0-c0e0-4553-ad26-b843f81c8863","Type":"ContainerStarted","Data":"6d3895f8b1260bbd2e497ba543e16d0f35a41d4da1d336faae01babe113750c9"} Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.514964 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-68ws8" event={"ID":"1c2557e3-14a8-4911-92b0-564bb7b60b06","Type":"ContainerStarted","Data":"459dac0a0114bdfaa744413cd389172d0a19ace3c3b243dc62d87bb6af5eb0d5"} Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.517458 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b67ffbb49-xj5cd" event={"ID":"703af3d8-b286-4fdb-ac04-ea2084575961","Type":"ContainerStarted","Data":"b658b1679117124f342b688c01aa0e55e52e8de7f9e2cc9258b77438467466bc"} Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.517511 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b67ffbb49-xj5cd" event={"ID":"703af3d8-b286-4fdb-ac04-ea2084575961","Type":"ContainerStarted","Data":"3aaa802e374f7b64e1bbb3c57b12393578366c40eb1a7153ec16655638c771b2"} Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.655206 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b67ffbb49-xj5cd" podStartSLOduration=1.655190017 podStartE2EDuration="1.655190017s" podCreationTimestamp="2026-01-11 17:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-11 17:44:00.540193581 +0000 UTC m=+814.718386297" watchObservedRunningTime="2026-01-11 17:44:00.655190017 +0000 UTC m=+814.833382723" Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.664912 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4tph5"] Jan 11 17:44:00 crc kubenswrapper[4837]: I0111 17:44:00.991221 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f"] Jan 11 17:44:00 crc kubenswrapper[4837]: W0111 17:44:00.994521 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b85571d_dea1_437f_bd5c_27d5d421411e.slice/crio-95b5a2b257891c691cc659c339d06ae3fe19c3b00db8b84a8b05d2b17ae39766 WatchSource:0}: Error finding container 95b5a2b257891c691cc659c339d06ae3fe19c3b00db8b84a8b05d2b17ae39766: Status 404 returned error can't find the container with id 95b5a2b257891c691cc659c339d06ae3fe19c3b00db8b84a8b05d2b17ae39766 Jan 11 17:44:01 crc kubenswrapper[4837]: I0111 17:44:01.524612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" event={"ID":"5d9870f8-4c71-4490-8e77-17f1a82e725a","Type":"ContainerStarted","Data":"15a0650a8374f1fe67c7b7239cccb07fb2ef128f7f9d4bb6f1330feca7a2b67c"} Jan 11 17:44:01 crc kubenswrapper[4837]: I0111 17:44:01.526504 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" event={"ID":"3b85571d-dea1-437f-bd5c-27d5d421411e","Type":"ContainerStarted","Data":"95b5a2b257891c691cc659c339d06ae3fe19c3b00db8b84a8b05d2b17ae39766"} Jan 11 17:44:03 crc kubenswrapper[4837]: I0111 17:44:03.543617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" 
event={"ID":"5d9870f8-4c71-4490-8e77-17f1a82e725a","Type":"ContainerStarted","Data":"1cf380a2481c5e8e63a518b136ef64bae28ffe9b5d94ad0dcd16869ab0a35248"} Jan 11 17:44:03 crc kubenswrapper[4837]: I0111 17:44:03.544272 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:44:03 crc kubenswrapper[4837]: I0111 17:44:03.550071 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" event={"ID":"c440fad0-c0e0-4553-ad26-b843f81c8863","Type":"ContainerStarted","Data":"5a73704f5edb67e3e41edcc350dfeb14f3fc5e0e1e744145c5d7eecda71ec407"} Jan 11 17:44:03 crc kubenswrapper[4837]: I0111 17:44:03.552565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-68ws8" event={"ID":"1c2557e3-14a8-4911-92b0-564bb7b60b06","Type":"ContainerStarted","Data":"1176ffafc44af3f20539331ea4ef52316d6420af3354835026d28380191ed654"} Jan 11 17:44:03 crc kubenswrapper[4837]: I0111 17:44:03.552761 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:44:03 crc kubenswrapper[4837]: I0111 17:44:03.575599 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" podStartSLOduration=2.807960396 podStartE2EDuration="4.575519059s" podCreationTimestamp="2026-01-11 17:43:59 +0000 UTC" firstStartedPulling="2026-01-11 17:44:00.680735928 +0000 UTC m=+814.858928634" lastFinishedPulling="2026-01-11 17:44:02.448294591 +0000 UTC m=+816.626487297" observedRunningTime="2026-01-11 17:44:03.562188914 +0000 UTC m=+817.740381610" watchObservedRunningTime="2026-01-11 17:44:03.575519059 +0000 UTC m=+817.753711775" Jan 11 17:44:03 crc kubenswrapper[4837]: I0111 17:44:03.582233 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-68ws8" 
podStartSLOduration=2.034424699 podStartE2EDuration="4.582216458s" podCreationTimestamp="2026-01-11 17:43:59 +0000 UTC" firstStartedPulling="2026-01-11 17:43:59.860818334 +0000 UTC m=+814.039011040" lastFinishedPulling="2026-01-11 17:44:02.408610093 +0000 UTC m=+816.586802799" observedRunningTime="2026-01-11 17:44:03.574934473 +0000 UTC m=+817.753127179" watchObservedRunningTime="2026-01-11 17:44:03.582216458 +0000 UTC m=+817.760409194" Jan 11 17:44:04 crc kubenswrapper[4837]: I0111 17:44:04.560235 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" event={"ID":"3b85571d-dea1-437f-bd5c-27d5d421411e","Type":"ContainerStarted","Data":"67c28cacacea153e355b8874332725eb8fca8ecf7a680af719310f77cd95e6f8"} Jan 11 17:44:04 crc kubenswrapper[4837]: I0111 17:44:04.591792 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-kkz9f" podStartSLOduration=3.115784575 podStartE2EDuration="5.591768418s" podCreationTimestamp="2026-01-11 17:43:59 +0000 UTC" firstStartedPulling="2026-01-11 17:44:00.998266085 +0000 UTC m=+815.176458831" lastFinishedPulling="2026-01-11 17:44:03.474249938 +0000 UTC m=+817.652442674" observedRunningTime="2026-01-11 17:44:04.578506284 +0000 UTC m=+818.756699030" watchObservedRunningTime="2026-01-11 17:44:04.591768418 +0000 UTC m=+818.769961154" Jan 11 17:44:05 crc kubenswrapper[4837]: I0111 17:44:05.573737 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" event={"ID":"c440fad0-c0e0-4553-ad26-b843f81c8863","Type":"ContainerStarted","Data":"ca12ca02730f578949b6dac4fafec86aa8fac945fd3d59ce7bbf296b1e351378"} Jan 11 17:44:05 crc kubenswrapper[4837]: I0111 17:44:05.613153 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-znz8s" podStartSLOduration=1.5308193509999999 
podStartE2EDuration="6.613103182s" podCreationTimestamp="2026-01-11 17:43:59 +0000 UTC" firstStartedPulling="2026-01-11 17:43:59.986461815 +0000 UTC m=+814.164654521" lastFinishedPulling="2026-01-11 17:44:05.068745646 +0000 UTC m=+819.246938352" observedRunningTime="2026-01-11 17:44:05.607051061 +0000 UTC m=+819.785243797" watchObservedRunningTime="2026-01-11 17:44:05.613103182 +0000 UTC m=+819.791295898" Jan 11 17:44:09 crc kubenswrapper[4837]: I0111 17:44:09.444836 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:44:09 crc kubenswrapper[4837]: I0111 17:44:09.445585 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:44:09 crc kubenswrapper[4837]: I0111 17:44:09.850316 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-68ws8" Jan 11 17:44:10 crc kubenswrapper[4837]: I0111 17:44:10.149471 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:44:10 crc kubenswrapper[4837]: I0111 17:44:10.149955 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:44:10 crc kubenswrapper[4837]: I0111 17:44:10.157841 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:44:10 crc kubenswrapper[4837]: I0111 17:44:10.621264 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-5b67ffbb49-xj5cd" Jan 11 17:44:10 crc kubenswrapper[4837]: I0111 17:44:10.690328 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v8df8"] Jan 11 17:44:20 crc kubenswrapper[4837]: I0111 17:44:20.395392 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4tph5" Jan 11 17:44:35 crc kubenswrapper[4837]: I0111 17:44:35.731770 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-v8df8" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" containerName="console" containerID="cri-o://e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89" gracePeriod=15 Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.094349 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v8df8_1141f492-afec-40f3-bde7-7072d6a75a68/console/0.log" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.094413 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.164991 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-service-ca\") pod \"1141f492-afec-40f3-bde7-7072d6a75a68\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.165741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-console-config\") pod \"1141f492-afec-40f3-bde7-7072d6a75a68\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.165803 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-oauth-serving-cert\") pod \"1141f492-afec-40f3-bde7-7072d6a75a68\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.165837 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-trusted-ca-bundle\") pod \"1141f492-afec-40f3-bde7-7072d6a75a68\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.165889 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-serving-cert\") pod \"1141f492-afec-40f3-bde7-7072d6a75a68\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.165946 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-service-ca" (OuterVolumeSpecName: "service-ca") pod "1141f492-afec-40f3-bde7-7072d6a75a68" (UID: "1141f492-afec-40f3-bde7-7072d6a75a68"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.166009 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-oauth-config\") pod \"1141f492-afec-40f3-bde7-7072d6a75a68\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.166097 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8sxd\" (UniqueName: \"kubernetes.io/projected/1141f492-afec-40f3-bde7-7072d6a75a68-kube-api-access-l8sxd\") pod \"1141f492-afec-40f3-bde7-7072d6a75a68\" (UID: \"1141f492-afec-40f3-bde7-7072d6a75a68\") " Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.166341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-console-config" (OuterVolumeSpecName: "console-config") pod "1141f492-afec-40f3-bde7-7072d6a75a68" (UID: "1141f492-afec-40f3-bde7-7072d6a75a68"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.166429 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1141f492-afec-40f3-bde7-7072d6a75a68" (UID: "1141f492-afec-40f3-bde7-7072d6a75a68"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.166502 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1141f492-afec-40f3-bde7-7072d6a75a68" (UID: "1141f492-afec-40f3-bde7-7072d6a75a68"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.167552 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-service-ca\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.167582 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-console-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.167597 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.167611 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1141f492-afec-40f3-bde7-7072d6a75a68-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.183269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1141f492-afec-40f3-bde7-7072d6a75a68" (UID: "1141f492-afec-40f3-bde7-7072d6a75a68"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.187537 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1141f492-afec-40f3-bde7-7072d6a75a68-kube-api-access-l8sxd" (OuterVolumeSpecName: "kube-api-access-l8sxd") pod "1141f492-afec-40f3-bde7-7072d6a75a68" (UID: "1141f492-afec-40f3-bde7-7072d6a75a68"). InnerVolumeSpecName "kube-api-access-l8sxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.193816 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1141f492-afec-40f3-bde7-7072d6a75a68" (UID: "1141f492-afec-40f3-bde7-7072d6a75a68"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.234911 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw"] Jan 11 17:44:36 crc kubenswrapper[4837]: E0111 17:44:36.235270 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" containerName="console" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.235298 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" containerName="console" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.235414 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" containerName="console" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.236391 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.239461 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.243548 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw"] Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.268923 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.268985 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcpcv\" (UniqueName: \"kubernetes.io/projected/55364c0f-8ad4-40fb-8739-65b09f608b27-kube-api-access-xcpcv\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.269043 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: 
I0111 17:44:36.269082 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.269093 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1141f492-afec-40f3-bde7-7072d6a75a68-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.269102 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8sxd\" (UniqueName: \"kubernetes.io/projected/1141f492-afec-40f3-bde7-7072d6a75a68-kube-api-access-l8sxd\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.370642 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcpcv\" (UniqueName: \"kubernetes.io/projected/55364c0f-8ad4-40fb-8739-65b09f608b27-kube-api-access-xcpcv\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.370742 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.370804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-bundle\") pod 
\"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.371315 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.371660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.401185 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcpcv\" (UniqueName: \"kubernetes.io/projected/55364c0f-8ad4-40fb-8739-65b09f608b27-kube-api-access-xcpcv\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.550324 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.765553 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw"] Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.792349 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v8df8_1141f492-afec-40f3-bde7-7072d6a75a68/console/0.log" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.792403 4837 generic.go:334] "Generic (PLEG): container finished" podID="1141f492-afec-40f3-bde7-7072d6a75a68" containerID="e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89" exitCode=2 Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.792457 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v8df8" event={"ID":"1141f492-afec-40f3-bde7-7072d6a75a68","Type":"ContainerDied","Data":"e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89"} Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.792488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v8df8" event={"ID":"1141f492-afec-40f3-bde7-7072d6a75a68","Type":"ContainerDied","Data":"73a286ac58474705dc48075cafc3a1bc6c04270cfe3d679f971104098fe04639"} Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.792530 4837 scope.go:117] "RemoveContainer" containerID="e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.792597 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v8df8" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.793329 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" event={"ID":"55364c0f-8ad4-40fb-8739-65b09f608b27","Type":"ContainerStarted","Data":"4d7ea483307317743300db4a7f4c87855a8954317b7b1871c230b8a804f1fdfa"} Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.813741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v8df8"] Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.818289 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-v8df8"] Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.821759 4837 scope.go:117] "RemoveContainer" containerID="e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89" Jan 11 17:44:36 crc kubenswrapper[4837]: E0111 17:44:36.822385 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89\": container with ID starting with e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89 not found: ID does not exist" containerID="e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89" Jan 11 17:44:36 crc kubenswrapper[4837]: I0111 17:44:36.822427 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89"} err="failed to get container status \"e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89\": rpc error: code = NotFound desc = could not find container \"e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89\": container with ID starting with e6cb39cd7ba9a53ad90183631cd08e2974d886a416a3c254a23cb1459c204d89 not found: ID does not 
exist" Jan 11 17:44:37 crc kubenswrapper[4837]: I0111 17:44:37.800509 4837 generic.go:334] "Generic (PLEG): container finished" podID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerID="fab8a6e4d69fd4203771b5032fecfb6c8c774629cf7861cb4d66a962cd59948c" exitCode=0 Jan 11 17:44:37 crc kubenswrapper[4837]: I0111 17:44:37.800554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" event={"ID":"55364c0f-8ad4-40fb-8739-65b09f608b27","Type":"ContainerDied","Data":"fab8a6e4d69fd4203771b5032fecfb6c8c774629cf7861cb4d66a962cd59948c"} Jan 11 17:44:38 crc kubenswrapper[4837]: I0111 17:44:38.376089 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1141f492-afec-40f3-bde7-7072d6a75a68" path="/var/lib/kubelet/pods/1141f492-afec-40f3-bde7-7072d6a75a68/volumes" Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.444410 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.444522 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.444635 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.445563 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6987b5d3ba894eab85f4ec92caa53aa44c7a42f0a8df2d9713c40a8de9354658"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.445670 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://6987b5d3ba894eab85f4ec92caa53aa44c7a42f0a8df2d9713c40a8de9354658" gracePeriod=600 Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.817412 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="6987b5d3ba894eab85f4ec92caa53aa44c7a42f0a8df2d9713c40a8de9354658" exitCode=0 Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.817621 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"6987b5d3ba894eab85f4ec92caa53aa44c7a42f0a8df2d9713c40a8de9354658"} Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.817813 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"fefbb2dded3dd3108c6af21a68e03b8d09000012756ad3a506890b3d9bced335"} Jan 11 17:44:39 crc kubenswrapper[4837]: I0111 17:44:39.817839 4837 scope.go:117] "RemoveContainer" containerID="10f6c1fba8d2ded4e3d8d28a0fb8b27acf8c6a02810295dad13d2e54f622ba5d" Jan 11 17:44:40 crc kubenswrapper[4837]: I0111 17:44:40.830031 4837 generic.go:334] "Generic (PLEG): container finished" podID="55364c0f-8ad4-40fb-8739-65b09f608b27" 
containerID="be6506f2c42d0b6f0d2327e88f2a401f3640b4c91f5e70f6ab390599b1f1b527" exitCode=0 Jan 11 17:44:40 crc kubenswrapper[4837]: I0111 17:44:40.830229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" event={"ID":"55364c0f-8ad4-40fb-8739-65b09f608b27","Type":"ContainerDied","Data":"be6506f2c42d0b6f0d2327e88f2a401f3640b4c91f5e70f6ab390599b1f1b527"} Jan 11 17:44:41 crc kubenswrapper[4837]: I0111 17:44:41.848533 4837 generic.go:334] "Generic (PLEG): container finished" podID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerID="64b70a30328928a3eb0105bf43dd0647c39d9303159f40fab5d4f35bdfc208d8" exitCode=0 Jan 11 17:44:41 crc kubenswrapper[4837]: I0111 17:44:41.849031 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" event={"ID":"55364c0f-8ad4-40fb-8739-65b09f608b27","Type":"ContainerDied","Data":"64b70a30328928a3eb0105bf43dd0647c39d9303159f40fab5d4f35bdfc208d8"} Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.131338 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.158387 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-util\") pod \"55364c0f-8ad4-40fb-8739-65b09f608b27\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.158438 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-bundle\") pod \"55364c0f-8ad4-40fb-8739-65b09f608b27\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.158523 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcpcv\" (UniqueName: \"kubernetes.io/projected/55364c0f-8ad4-40fb-8739-65b09f608b27-kube-api-access-xcpcv\") pod \"55364c0f-8ad4-40fb-8739-65b09f608b27\" (UID: \"55364c0f-8ad4-40fb-8739-65b09f608b27\") " Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.161267 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-bundle" (OuterVolumeSpecName: "bundle") pod "55364c0f-8ad4-40fb-8739-65b09f608b27" (UID: "55364c0f-8ad4-40fb-8739-65b09f608b27"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.168358 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55364c0f-8ad4-40fb-8739-65b09f608b27-kube-api-access-xcpcv" (OuterVolumeSpecName: "kube-api-access-xcpcv") pod "55364c0f-8ad4-40fb-8739-65b09f608b27" (UID: "55364c0f-8ad4-40fb-8739-65b09f608b27"). InnerVolumeSpecName "kube-api-access-xcpcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.171179 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-util" (OuterVolumeSpecName: "util") pod "55364c0f-8ad4-40fb-8739-65b09f608b27" (UID: "55364c0f-8ad4-40fb-8739-65b09f608b27"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.259829 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-util\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.259870 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55364c0f-8ad4-40fb-8739-65b09f608b27-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.259880 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcpcv\" (UniqueName: \"kubernetes.io/projected/55364c0f-8ad4-40fb-8739-65b09f608b27-kube-api-access-xcpcv\") on node \"crc\" DevicePath \"\"" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.864212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" event={"ID":"55364c0f-8ad4-40fb-8739-65b09f608b27","Type":"ContainerDied","Data":"4d7ea483307317743300db4a7f4c87855a8954317b7b1871c230b8a804f1fdfa"} Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.864259 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d7ea483307317743300db4a7f4c87855a8954317b7b1871c230b8a804f1fdfa" Jan 11 17:44:43 crc kubenswrapper[4837]: I0111 17:44:43.864366 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.420215 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b497646f-46nhj"] Jan 11 17:44:54 crc kubenswrapper[4837]: E0111 17:44:54.420848 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerName="extract" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.420860 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerName="extract" Jan 11 17:44:54 crc kubenswrapper[4837]: E0111 17:44:54.420871 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerName="util" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.420876 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerName="util" Jan 11 17:44:54 crc kubenswrapper[4837]: E0111 17:44:54.420887 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerName="pull" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.420894 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerName="pull" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.420993 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="55364c0f-8ad4-40fb-8739-65b09f608b27" containerName="extract" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.421357 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.423981 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.424213 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.424282 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.424963 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.426298 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-jngq2" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.447216 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b497646f-46nhj"] Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.512534 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3e8b743-e8f1-453a-9f63-44700da2d56a-apiservice-cert\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.512596 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4qtb\" (UniqueName: \"kubernetes.io/projected/b3e8b743-e8f1-453a-9f63-44700da2d56a-kube-api-access-f4qtb\") pod 
\"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.512960 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3e8b743-e8f1-453a-9f63-44700da2d56a-webhook-cert\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.613933 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3e8b743-e8f1-453a-9f63-44700da2d56a-webhook-cert\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.613991 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3e8b743-e8f1-453a-9f63-44700da2d56a-apiservice-cert\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.614023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4qtb\" (UniqueName: \"kubernetes.io/projected/b3e8b743-e8f1-453a-9f63-44700da2d56a-kube-api-access-f4qtb\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: 
I0111 17:44:54.621488 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3e8b743-e8f1-453a-9f63-44700da2d56a-apiservice-cert\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.622292 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3e8b743-e8f1-453a-9f63-44700da2d56a-webhook-cert\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.633901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4qtb\" (UniqueName: \"kubernetes.io/projected/b3e8b743-e8f1-453a-9f63-44700da2d56a-kube-api-access-f4qtb\") pod \"metallb-operator-controller-manager-7b497646f-46nhj\" (UID: \"b3e8b743-e8f1-453a-9f63-44700da2d56a\") " pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.736256 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.764130 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn"] Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.765077 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.770718 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.770866 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-g9cvr" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.771707 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.779528 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn"] Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.816434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-webhook-cert\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.816493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-apiservice-cert\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.816568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4lt\" (UniqueName: 
\"kubernetes.io/projected/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-kube-api-access-9t4lt\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.918130 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4lt\" (UniqueName: \"kubernetes.io/projected/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-kube-api-access-9t4lt\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.918516 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-webhook-cert\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.918556 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-apiservice-cert\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.943392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-apiservice-cert\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" 
Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.943893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-webhook-cert\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:54 crc kubenswrapper[4837]: I0111 17:44:54.965873 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4lt\" (UniqueName: \"kubernetes.io/projected/28a54c4f-092d-4c3e-b528-9d3651c4f3a9-kube-api-access-9t4lt\") pod \"metallb-operator-webhook-server-55549fb586-lvpmn\" (UID: \"28a54c4f-092d-4c3e-b528-9d3651c4f3a9\") " pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:55 crc kubenswrapper[4837]: I0111 17:44:55.019993 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b497646f-46nhj"] Jan 11 17:44:55 crc kubenswrapper[4837]: W0111 17:44:55.031130 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e8b743_e8f1_453a_9f63_44700da2d56a.slice/crio-3046068c90dbbe38f99b1025948ef613a203042baf4a2ab47e2893f3cfbb6c19 WatchSource:0}: Error finding container 3046068c90dbbe38f99b1025948ef613a203042baf4a2ab47e2893f3cfbb6c19: Status 404 returned error can't find the container with id 3046068c90dbbe38f99b1025948ef613a203042baf4a2ab47e2893f3cfbb6c19 Jan 11 17:44:55 crc kubenswrapper[4837]: I0111 17:44:55.124259 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:44:55 crc kubenswrapper[4837]: I0111 17:44:55.493106 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn"] Jan 11 17:44:55 crc kubenswrapper[4837]: W0111 17:44:55.499639 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28a54c4f_092d_4c3e_b528_9d3651c4f3a9.slice/crio-2327edd58669880843b0a53f79ab6da746299df1c87a76300652619146abdad7 WatchSource:0}: Error finding container 2327edd58669880843b0a53f79ab6da746299df1c87a76300652619146abdad7: Status 404 returned error can't find the container with id 2327edd58669880843b0a53f79ab6da746299df1c87a76300652619146abdad7 Jan 11 17:44:55 crc kubenswrapper[4837]: I0111 17:44:55.956499 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" event={"ID":"b3e8b743-e8f1-453a-9f63-44700da2d56a","Type":"ContainerStarted","Data":"3046068c90dbbe38f99b1025948ef613a203042baf4a2ab47e2893f3cfbb6c19"} Jan 11 17:44:55 crc kubenswrapper[4837]: I0111 17:44:55.958283 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" event={"ID":"28a54c4f-092d-4c3e-b528-9d3651c4f3a9","Type":"ContainerStarted","Data":"2327edd58669880843b0a53f79ab6da746299df1c87a76300652619146abdad7"} Jan 11 17:44:58 crc kubenswrapper[4837]: I0111 17:44:58.978220 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" event={"ID":"b3e8b743-e8f1-453a-9f63-44700da2d56a","Type":"ContainerStarted","Data":"c89c2cb1dea15756361ac1e90328f740d3b96a5a1540a24015ea6ddd396395f2"} Jan 11 17:44:58 crc kubenswrapper[4837]: I0111 17:44:58.978859 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:44:59 crc kubenswrapper[4837]: I0111 17:44:59.003237 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" podStartSLOduration=1.586647268 podStartE2EDuration="5.003213728s" podCreationTimestamp="2026-01-11 17:44:54 +0000 UTC" firstStartedPulling="2026-01-11 17:44:55.034454745 +0000 UTC m=+869.212647441" lastFinishedPulling="2026-01-11 17:44:58.451021195 +0000 UTC m=+872.629213901" observedRunningTime="2026-01-11 17:44:59.001166063 +0000 UTC m=+873.179358779" watchObservedRunningTime="2026-01-11 17:44:59.003213728 +0000 UTC m=+873.181406434" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.142782 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp"] Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.143708 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.145693 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.146477 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.163174 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp"] Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.196619 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx55v\" (UniqueName: \"kubernetes.io/projected/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-kube-api-access-zx55v\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.196719 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-config-volume\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.196736 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-secret-volume\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.297629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx55v\" (UniqueName: \"kubernetes.io/projected/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-kube-api-access-zx55v\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.297728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-config-volume\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.297756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-secret-volume\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.298749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-config-volume\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.306121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-secret-volume\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.319398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx55v\" (UniqueName: \"kubernetes.io/projected/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-kube-api-access-zx55v\") pod \"collect-profiles-29469225-tjvdp\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.459280 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.698665 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp"] Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.996926 4837 generic.go:334] "Generic (PLEG): container finished" podID="f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" containerID="532cabea9051f842e4d1c7aedc1a13ba9afd22b80ad1fd825909f599e9c6e86a" exitCode=0 Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.997000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" event={"ID":"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745","Type":"ContainerDied","Data":"532cabea9051f842e4d1c7aedc1a13ba9afd22b80ad1fd825909f599e9c6e86a"} Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.997025 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" 
event={"ID":"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745","Type":"ContainerStarted","Data":"0c34f7903260e2336ea032cf1c8ca376ddd239dc059d2a9b8d2082868a1692db"} Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.999007 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" event={"ID":"28a54c4f-092d-4c3e-b528-9d3651c4f3a9","Type":"ContainerStarted","Data":"9c969dd52df491cb34857bf261ffaf176f5b0b70bb5e11484ba7550422c52329"} Jan 11 17:45:00 crc kubenswrapper[4837]: I0111 17:45:00.999136 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:45:01 crc kubenswrapper[4837]: I0111 17:45:01.033783 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" podStartSLOduration=2.668021845 podStartE2EDuration="7.033761325s" podCreationTimestamp="2026-01-11 17:44:54 +0000 UTC" firstStartedPulling="2026-01-11 17:44:55.502482184 +0000 UTC m=+869.680674890" lastFinishedPulling="2026-01-11 17:44:59.868221664 +0000 UTC m=+874.046414370" observedRunningTime="2026-01-11 17:45:01.029572204 +0000 UTC m=+875.207764930" watchObservedRunningTime="2026-01-11 17:45:01.033761325 +0000 UTC m=+875.211954041" Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.307181 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.427058 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx55v\" (UniqueName: \"kubernetes.io/projected/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-kube-api-access-zx55v\") pod \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.427130 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-secret-volume\") pod \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.427186 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-config-volume\") pod \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\" (UID: \"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745\") " Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.428048 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-config-volume" (OuterVolumeSpecName: "config-volume") pod "f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" (UID: "f2b9f39d-7ce7-4626-9a5c-7ed9b0917745"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.432563 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-kube-api-access-zx55v" (OuterVolumeSpecName: "kube-api-access-zx55v") pod "f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" (UID: "f2b9f39d-7ce7-4626-9a5c-7ed9b0917745"). 
InnerVolumeSpecName "kube-api-access-zx55v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.441911 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" (UID: "f2b9f39d-7ce7-4626-9a5c-7ed9b0917745"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.528647 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-config-volume\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.528698 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx55v\" (UniqueName: \"kubernetes.io/projected/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-kube-api-access-zx55v\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:02 crc kubenswrapper[4837]: I0111 17:45:02.528713 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:03 crc kubenswrapper[4837]: I0111 17:45:03.012402 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" event={"ID":"f2b9f39d-7ce7-4626-9a5c-7ed9b0917745","Type":"ContainerDied","Data":"0c34f7903260e2336ea032cf1c8ca376ddd239dc059d2a9b8d2082868a1692db"} Jan 11 17:45:03 crc kubenswrapper[4837]: I0111 17:45:03.012871 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c34f7903260e2336ea032cf1c8ca376ddd239dc059d2a9b8d2082868a1692db" Jan 11 17:45:03 crc kubenswrapper[4837]: I0111 17:45:03.012446 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp" Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.856619 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wx9lf"] Jan 11 17:45:04 crc kubenswrapper[4837]: E0111 17:45:04.857208 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" containerName="collect-profiles" Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.857224 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" containerName="collect-profiles" Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.857429 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" containerName="collect-profiles" Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.858276 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.873592 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx9lf"] Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.957829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmdw\" (UniqueName: \"kubernetes.io/projected/5599acec-41ba-4615-a5fa-eec3d2ffaa41-kube-api-access-5zmdw\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.958110 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-catalog-content\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:04 crc kubenswrapper[4837]: I0111 17:45:04.958227 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-utilities\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.059408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmdw\" (UniqueName: \"kubernetes.io/projected/5599acec-41ba-4615-a5fa-eec3d2ffaa41-kube-api-access-5zmdw\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.059545 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-catalog-content\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.059601 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-utilities\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.060249 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-utilities\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.060280 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-catalog-content\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.090800 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmdw\" (UniqueName: \"kubernetes.io/projected/5599acec-41ba-4615-a5fa-eec3d2ffaa41-kube-api-access-5zmdw\") pod \"certified-operators-wx9lf\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.177714 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:05 crc kubenswrapper[4837]: I0111 17:45:05.466155 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wx9lf"] Jan 11 17:45:06 crc kubenswrapper[4837]: I0111 17:45:06.034168 4837 generic.go:334] "Generic (PLEG): container finished" podID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerID="84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476" exitCode=0 Jan 11 17:45:06 crc kubenswrapper[4837]: I0111 17:45:06.034218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9lf" event={"ID":"5599acec-41ba-4615-a5fa-eec3d2ffaa41","Type":"ContainerDied","Data":"84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476"} Jan 11 17:45:06 crc kubenswrapper[4837]: I0111 17:45:06.034254 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9lf" event={"ID":"5599acec-41ba-4615-a5fa-eec3d2ffaa41","Type":"ContainerStarted","Data":"294ed29ddc252304490404a9f53ce99c27d1542b947828a40431b04bcd791407"} Jan 11 17:45:08 crc kubenswrapper[4837]: I0111 17:45:08.048194 4837 generic.go:334] "Generic (PLEG): container finished" podID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerID="ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7" exitCode=0 Jan 11 17:45:08 crc kubenswrapper[4837]: I0111 17:45:08.048286 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9lf" event={"ID":"5599acec-41ba-4615-a5fa-eec3d2ffaa41","Type":"ContainerDied","Data":"ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7"} Jan 11 17:45:09 crc kubenswrapper[4837]: I0111 17:45:09.057637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9lf" 
event={"ID":"5599acec-41ba-4615-a5fa-eec3d2ffaa41","Type":"ContainerStarted","Data":"3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0"} Jan 11 17:45:09 crc kubenswrapper[4837]: I0111 17:45:09.079457 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wx9lf" podStartSLOduration=2.713136942 podStartE2EDuration="5.079439258s" podCreationTimestamp="2026-01-11 17:45:04 +0000 UTC" firstStartedPulling="2026-01-11 17:45:06.035901333 +0000 UTC m=+880.214094039" lastFinishedPulling="2026-01-11 17:45:08.402203649 +0000 UTC m=+882.580396355" observedRunningTime="2026-01-11 17:45:09.079198232 +0000 UTC m=+883.257390968" watchObservedRunningTime="2026-01-11 17:45:09.079439258 +0000 UTC m=+883.257631964" Jan 11 17:45:15 crc kubenswrapper[4837]: I0111 17:45:15.130505 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-55549fb586-lvpmn" Jan 11 17:45:15 crc kubenswrapper[4837]: I0111 17:45:15.178827 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:15 crc kubenswrapper[4837]: I0111 17:45:15.178905 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:15 crc kubenswrapper[4837]: I0111 17:45:15.230629 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:16 crc kubenswrapper[4837]: I0111 17:45:16.153590 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:17 crc kubenswrapper[4837]: I0111 17:45:17.640360 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx9lf"] Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.110047 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wx9lf" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="registry-server" containerID="cri-o://3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0" gracePeriod=2 Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.555856 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.643695 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zmdw\" (UniqueName: \"kubernetes.io/projected/5599acec-41ba-4615-a5fa-eec3d2ffaa41-kube-api-access-5zmdw\") pod \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.643836 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-utilities\") pod \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.643886 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-catalog-content\") pod \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\" (UID: \"5599acec-41ba-4615-a5fa-eec3d2ffaa41\") " Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.644923 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-utilities" (OuterVolumeSpecName: "utilities") pod "5599acec-41ba-4615-a5fa-eec3d2ffaa41" (UID: "5599acec-41ba-4615-a5fa-eec3d2ffaa41"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.656540 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5599acec-41ba-4615-a5fa-eec3d2ffaa41-kube-api-access-5zmdw" (OuterVolumeSpecName: "kube-api-access-5zmdw") pod "5599acec-41ba-4615-a5fa-eec3d2ffaa41" (UID: "5599acec-41ba-4615-a5fa-eec3d2ffaa41"). InnerVolumeSpecName "kube-api-access-5zmdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.695283 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5599acec-41ba-4615-a5fa-eec3d2ffaa41" (UID: "5599acec-41ba-4615-a5fa-eec3d2ffaa41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.746090 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zmdw\" (UniqueName: \"kubernetes.io/projected/5599acec-41ba-4615-a5fa-eec3d2ffaa41-kube-api-access-5zmdw\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.746126 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:18 crc kubenswrapper[4837]: I0111 17:45:18.746138 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5599acec-41ba-4615-a5fa-eec3d2ffaa41-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.121003 4837 generic.go:334] "Generic (PLEG): container finished" podID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" 
containerID="3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0" exitCode=0 Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.121092 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wx9lf" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.121071 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9lf" event={"ID":"5599acec-41ba-4615-a5fa-eec3d2ffaa41","Type":"ContainerDied","Data":"3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0"} Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.121448 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wx9lf" event={"ID":"5599acec-41ba-4615-a5fa-eec3d2ffaa41","Type":"ContainerDied","Data":"294ed29ddc252304490404a9f53ce99c27d1542b947828a40431b04bcd791407"} Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.121477 4837 scope.go:117] "RemoveContainer" containerID="3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.142063 4837 scope.go:117] "RemoveContainer" containerID="ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.162234 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wx9lf"] Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.166339 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wx9lf"] Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.170581 4837 scope.go:117] "RemoveContainer" containerID="84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.196272 4837 scope.go:117] "RemoveContainer" containerID="3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0" Jan 11 
17:45:19 crc kubenswrapper[4837]: E0111 17:45:19.196937 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0\": container with ID starting with 3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0 not found: ID does not exist" containerID="3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.196967 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0"} err="failed to get container status \"3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0\": rpc error: code = NotFound desc = could not find container \"3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0\": container with ID starting with 3311b4a8082160fc764545dc2423940c73454b165c917f3524ac2a7bc7d6bbe0 not found: ID does not exist" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.196989 4837 scope.go:117] "RemoveContainer" containerID="ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7" Jan 11 17:45:19 crc kubenswrapper[4837]: E0111 17:45:19.197407 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7\": container with ID starting with ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7 not found: ID does not exist" containerID="ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.197428 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7"} err="failed to get container status 
\"ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7\": rpc error: code = NotFound desc = could not find container \"ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7\": container with ID starting with ee8292639a744e87db7627db7c7a466a7621811282d1f7da911eafd8653bcdf7 not found: ID does not exist" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.197440 4837 scope.go:117] "RemoveContainer" containerID="84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476" Jan 11 17:45:19 crc kubenswrapper[4837]: E0111 17:45:19.197794 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476\": container with ID starting with 84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476 not found: ID does not exist" containerID="84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476" Jan 11 17:45:19 crc kubenswrapper[4837]: I0111 17:45:19.197855 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476"} err="failed to get container status \"84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476\": rpc error: code = NotFound desc = could not find container \"84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476\": container with ID starting with 84f6cd0137351dee578a20045a077b0fcb45dcef670d807c9675430cd2971476 not found: ID does not exist" Jan 11 17:45:20 crc kubenswrapper[4837]: I0111 17:45:20.375469 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" path="/var/lib/kubelet/pods/5599acec-41ba-4615-a5fa-eec3d2ffaa41/volumes" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.649649 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fqccs"] Jan 11 17:45:21 
crc kubenswrapper[4837]: E0111 17:45:21.650008 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="registry-server" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.650026 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="registry-server" Jan 11 17:45:21 crc kubenswrapper[4837]: E0111 17:45:21.650054 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="extract-utilities" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.650062 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="extract-utilities" Jan 11 17:45:21 crc kubenswrapper[4837]: E0111 17:45:21.650086 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="extract-content" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.650099 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="extract-content" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.650265 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5599acec-41ba-4615-a5fa-eec3d2ffaa41" containerName="registry-server" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.651469 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.664788 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqccs"] Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.698714 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57vtv\" (UniqueName: \"kubernetes.io/projected/c3bf552a-61ce-44a2-9bfe-902e539a1291-kube-api-access-57vtv\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.698826 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-utilities\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.698864 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-catalog-content\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.800192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-utilities\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.800254 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-catalog-content\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.800298 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57vtv\" (UniqueName: \"kubernetes.io/projected/c3bf552a-61ce-44a2-9bfe-902e539a1291-kube-api-access-57vtv\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.800817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-utilities\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.800899 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-catalog-content\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.827224 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57vtv\" (UniqueName: \"kubernetes.io/projected/c3bf552a-61ce-44a2-9bfe-902e539a1291-kube-api-access-57vtv\") pod \"community-operators-fqccs\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:21 crc kubenswrapper[4837]: I0111 17:45:21.972337 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:22 crc kubenswrapper[4837]: I0111 17:45:22.452511 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqccs"] Jan 11 17:45:23 crc kubenswrapper[4837]: I0111 17:45:23.152303 4837 generic.go:334] "Generic (PLEG): container finished" podID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerID="18c92cd4a3bd3734317535ece1496fcc4cbb3dc6f700ae726a25757a3c2b42a2" exitCode=0 Jan 11 17:45:23 crc kubenswrapper[4837]: I0111 17:45:23.152374 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqccs" event={"ID":"c3bf552a-61ce-44a2-9bfe-902e539a1291","Type":"ContainerDied","Data":"18c92cd4a3bd3734317535ece1496fcc4cbb3dc6f700ae726a25757a3c2b42a2"} Jan 11 17:45:23 crc kubenswrapper[4837]: I0111 17:45:23.152457 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqccs" event={"ID":"c3bf552a-61ce-44a2-9bfe-902e539a1291","Type":"ContainerStarted","Data":"2d5999272912d033fe98c1de556e6eed158ca411fa7fb65c69c37c2c4fc48d10"} Jan 11 17:45:27 crc kubenswrapper[4837]: I0111 17:45:27.180088 4837 generic.go:334] "Generic (PLEG): container finished" podID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerID="e180bfdfe3562fd566d0291a8b460860ef4e8442d1c46673522f4b6a1cd0bdfd" exitCode=0 Jan 11 17:45:27 crc kubenswrapper[4837]: I0111 17:45:27.180205 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqccs" event={"ID":"c3bf552a-61ce-44a2-9bfe-902e539a1291","Type":"ContainerDied","Data":"e180bfdfe3562fd566d0291a8b460860ef4e8442d1c46673522f4b6a1cd0bdfd"} Jan 11 17:45:29 crc kubenswrapper[4837]: I0111 17:45:29.202444 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqccs" 
event={"ID":"c3bf552a-61ce-44a2-9bfe-902e539a1291","Type":"ContainerStarted","Data":"3fd42816ca63a537b37c20db955b5de922a9782c207ce031e92e850c4677a97c"} Jan 11 17:45:29 crc kubenswrapper[4837]: I0111 17:45:29.241509 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fqccs" podStartSLOduration=3.374366003 podStartE2EDuration="8.241429135s" podCreationTimestamp="2026-01-11 17:45:21 +0000 UTC" firstStartedPulling="2026-01-11 17:45:23.154409365 +0000 UTC m=+897.332602071" lastFinishedPulling="2026-01-11 17:45:28.021472457 +0000 UTC m=+902.199665203" observedRunningTime="2026-01-11 17:45:29.228811558 +0000 UTC m=+903.407004264" watchObservedRunningTime="2026-01-11 17:45:29.241429135 +0000 UTC m=+903.419621881" Jan 11 17:45:31 crc kubenswrapper[4837]: I0111 17:45:31.972609 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:31 crc kubenswrapper[4837]: I0111 17:45:31.972705 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:32 crc kubenswrapper[4837]: I0111 17:45:32.014788 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:33 crc kubenswrapper[4837]: I0111 17:45:33.280578 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:33 crc kubenswrapper[4837]: I0111 17:45:33.326329 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqccs"] Jan 11 17:45:34 crc kubenswrapper[4837]: I0111 17:45:34.739936 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b497646f-46nhj" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.248395 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fqccs" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="registry-server" containerID="cri-o://3fd42816ca63a537b37c20db955b5de922a9782c207ce031e92e850c4677a97c" gracePeriod=2 Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.516367 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-m6x95"] Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.519361 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.521202 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8"] Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.521835 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.524874 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zlxxt" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.525098 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.525250 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.527934 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.536255 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8"] Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590126 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h27l7\" (UniqueName: \"kubernetes.io/projected/aa1bc5b4-0f84-413a-a7fd-d2531bbb8265-kube-api-access-h27l7\") pod \"frr-k8s-webhook-server-7784b6fcf-568c8\" (UID: \"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590175 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-conf\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590209 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzqf\" (UniqueName: \"kubernetes.io/projected/d4fb0b82-36cb-45ce-b356-1a740d312fcf-kube-api-access-jrzqf\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590234 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa1bc5b4-0f84-413a-a7fd-d2531bbb8265-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-568c8\" (UID: \"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590417 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-reloader\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590465 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-startup\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590576 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-sockets\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590691 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics-certs\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.590785 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.614868 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8bfbc"] Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.615951 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.618235 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xhpjv" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.618273 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.618602 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.618859 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.638256 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-5bpfr"] Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.639399 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.643463 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.652449 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-5bpfr"] Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691475 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691515 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964w6\" (UniqueName: \"kubernetes.io/projected/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-kube-api-access-964w6\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691532 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-cert\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691636 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h27l7\" (UniqueName: \"kubernetes.io/projected/aa1bc5b4-0f84-413a-a7fd-d2531bbb8265-kube-api-access-h27l7\") pod \"frr-k8s-webhook-server-7784b6fcf-568c8\" (UID: \"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691686 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-conf\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691718 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-metrics-certs\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrzqf\" (UniqueName: \"kubernetes.io/projected/d4fb0b82-36cb-45ce-b356-1a740d312fcf-kube-api-access-jrzqf\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691773 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-metallb-excludel2\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa1bc5b4-0f84-413a-a7fd-d2531bbb8265-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-568c8\" (UID: \"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-metrics-certs\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-reloader\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691892 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-startup\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-sockets\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691977 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics-certs\") pod 
\"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv4xd\" (UniqueName: \"kubernetes.io/projected/da885226-0b14-4626-8d89-7d4505ab29a1-kube-api-access-nv4xd\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.691998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.692204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-conf\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.692592 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-reloader\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.693262 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-startup\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.693281 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d4fb0b82-36cb-45ce-b356-1a740d312fcf-frr-sockets\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: E0111 17:45:35.693298 4837 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 11 17:45:35 crc kubenswrapper[4837]: E0111 17:45:35.693393 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics-certs podName:d4fb0b82-36cb-45ce-b356-1a740d312fcf nodeName:}" failed. No retries permitted until 2026-01-11 17:45:36.193370423 +0000 UTC m=+910.371563199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics-certs") pod "frr-k8s-m6x95" (UID: "d4fb0b82-36cb-45ce-b356-1a740d312fcf") : secret "frr-k8s-certs-secret" not found Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.697879 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa1bc5b4-0f84-413a-a7fd-d2531bbb8265-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-568c8\" (UID: \"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.714961 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzqf\" (UniqueName: \"kubernetes.io/projected/d4fb0b82-36cb-45ce-b356-1a740d312fcf-kube-api-access-jrzqf\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.715276 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h27l7\" (UniqueName: \"kubernetes.io/projected/aa1bc5b4-0f84-413a-a7fd-d2531bbb8265-kube-api-access-h27l7\") pod \"frr-k8s-webhook-server-7784b6fcf-568c8\" (UID: \"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.792947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv4xd\" (UniqueName: \"kubernetes.io/projected/da885226-0b14-4626-8d89-7d4505ab29a1-kube-api-access-nv4xd\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.793034 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.793063 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964w6\" (UniqueName: \"kubernetes.io/projected/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-kube-api-access-964w6\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.793091 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-cert\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.793130 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-metrics-certs\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.793158 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-metallb-excludel2\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.793230 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-metrics-certs\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: E0111 17:45:35.793356 4837 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 11 17:45:35 crc kubenswrapper[4837]: E0111 17:45:35.793361 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 11 17:45:35 crc kubenswrapper[4837]: E0111 17:45:35.793417 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist podName:e3e46c8e-1e90-49a8-a3eb-879ccd3c4807 nodeName:}" failed. No retries permitted until 2026-01-11 17:45:36.293397131 +0000 UTC m=+910.471589837 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist") pod "speaker-8bfbc" (UID: "e3e46c8e-1e90-49a8-a3eb-879ccd3c4807") : secret "metallb-memberlist" not found Jan 11 17:45:35 crc kubenswrapper[4837]: E0111 17:45:35.793438 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-metrics-certs podName:da885226-0b14-4626-8d89-7d4505ab29a1 nodeName:}" failed. No retries permitted until 2026-01-11 17:45:36.293429392 +0000 UTC m=+910.471622098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-metrics-certs") pod "controller-5bddd4b946-5bpfr" (UID: "da885226-0b14-4626-8d89-7d4505ab29a1") : secret "controller-certs-secret" not found Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.794343 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-metallb-excludel2\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.795164 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.796511 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-metrics-certs\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.809409 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv4xd\" (UniqueName: 
\"kubernetes.io/projected/da885226-0b14-4626-8d89-7d4505ab29a1-kube-api-access-nv4xd\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.810290 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-cert\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.822853 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-964w6\" (UniqueName: \"kubernetes.io/projected/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-kube-api-access-964w6\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:35 crc kubenswrapper[4837]: I0111 17:45:35.853869 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.197931 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics-certs\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.205818 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4fb0b82-36cb-45ce-b356-1a740d312fcf-metrics-certs\") pod \"frr-k8s-m6x95\" (UID: \"d4fb0b82-36cb-45ce-b356-1a740d312fcf\") " pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.263524 4837 generic.go:334] "Generic (PLEG): container finished" podID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerID="3fd42816ca63a537b37c20db955b5de922a9782c207ce031e92e850c4677a97c" exitCode=0 Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.263574 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqccs" event={"ID":"c3bf552a-61ce-44a2-9bfe-902e539a1291","Type":"ContainerDied","Data":"3fd42816ca63a537b37c20db955b5de922a9782c207ce031e92e850c4677a97c"} Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.300100 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.300247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-metrics-certs\") pod 
\"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:36 crc kubenswrapper[4837]: E0111 17:45:36.300402 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 11 17:45:36 crc kubenswrapper[4837]: E0111 17:45:36.300533 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist podName:e3e46c8e-1e90-49a8-a3eb-879ccd3c4807 nodeName:}" failed. No retries permitted until 2026-01-11 17:45:37.300501906 +0000 UTC m=+911.478694672 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist") pod "speaker-8bfbc" (UID: "e3e46c8e-1e90-49a8-a3eb-879ccd3c4807") : secret "metallb-memberlist" not found Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.303848 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da885226-0b14-4626-8d89-7d4505ab29a1-metrics-certs\") pod \"controller-5bddd4b946-5bpfr\" (UID: \"da885226-0b14-4626-8d89-7d4505ab29a1\") " pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.318067 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8"] Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.442437 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.560985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.685116 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.706551 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57vtv\" (UniqueName: \"kubernetes.io/projected/c3bf552a-61ce-44a2-9bfe-902e539a1291-kube-api-access-57vtv\") pod \"c3bf552a-61ce-44a2-9bfe-902e539a1291\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.706613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-utilities\") pod \"c3bf552a-61ce-44a2-9bfe-902e539a1291\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.706666 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-catalog-content\") pod \"c3bf552a-61ce-44a2-9bfe-902e539a1291\" (UID: \"c3bf552a-61ce-44a2-9bfe-902e539a1291\") " Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.714291 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-utilities" (OuterVolumeSpecName: "utilities") pod "c3bf552a-61ce-44a2-9bfe-902e539a1291" (UID: "c3bf552a-61ce-44a2-9bfe-902e539a1291"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.714420 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bf552a-61ce-44a2-9bfe-902e539a1291-kube-api-access-57vtv" (OuterVolumeSpecName: "kube-api-access-57vtv") pod "c3bf552a-61ce-44a2-9bfe-902e539a1291" (UID: "c3bf552a-61ce-44a2-9bfe-902e539a1291"). InnerVolumeSpecName "kube-api-access-57vtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.808091 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57vtv\" (UniqueName: \"kubernetes.io/projected/c3bf552a-61ce-44a2-9bfe-902e539a1291-kube-api-access-57vtv\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:36 crc kubenswrapper[4837]: I0111 17:45:36.808116 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.050989 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-5bpfr"] Jan 11 17:45:37 crc kubenswrapper[4837]: W0111 17:45:37.056322 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda885226_0b14_4626_8d89_7d4505ab29a1.slice/crio-a48a8bdcfe235fad9befdfe23040e7f930c24d564218fabc47e40b72c1b000d3 WatchSource:0}: Error finding container a48a8bdcfe235fad9befdfe23040e7f930c24d564218fabc47e40b72c1b000d3: Status 404 returned error can't find the container with id a48a8bdcfe235fad9befdfe23040e7f930c24d564218fabc47e40b72c1b000d3 Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.097486 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3bf552a-61ce-44a2-9bfe-902e539a1291" (UID: "c3bf552a-61ce-44a2-9bfe-902e539a1291"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.113081 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bf552a-61ce-44a2-9bfe-902e539a1291-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.269271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-5bpfr" event={"ID":"da885226-0b14-4626-8d89-7d4505ab29a1","Type":"ContainerStarted","Data":"a48a8bdcfe235fad9befdfe23040e7f930c24d564218fabc47e40b72c1b000d3"} Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.270372 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" event={"ID":"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265","Type":"ContainerStarted","Data":"29342bd75b9d1913e54df4f2b1bef3f7019c32d75bf892702627ca92528d4b04"} Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.272014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqccs" event={"ID":"c3bf552a-61ce-44a2-9bfe-902e539a1291","Type":"ContainerDied","Data":"2d5999272912d033fe98c1de556e6eed158ca411fa7fb65c69c37c2c4fc48d10"} Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.272098 4837 scope.go:117] "RemoveContainer" containerID="3fd42816ca63a537b37c20db955b5de922a9782c207ce031e92e850c4677a97c" Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.272238 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqccs" Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.293015 4837 scope.go:117] "RemoveContainer" containerID="e180bfdfe3562fd566d0291a8b460860ef4e8442d1c46673522f4b6a1cd0bdfd" Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.298345 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqccs"] Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.304051 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fqccs"] Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.316043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:37 crc kubenswrapper[4837]: E0111 17:45:37.316203 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 11 17:45:37 crc kubenswrapper[4837]: E0111 17:45:37.316253 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist podName:e3e46c8e-1e90-49a8-a3eb-879ccd3c4807 nodeName:}" failed. No retries permitted until 2026-01-11 17:45:39.316238968 +0000 UTC m=+913.494431674 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist") pod "speaker-8bfbc" (UID: "e3e46c8e-1e90-49a8-a3eb-879ccd3c4807") : secret "metallb-memberlist" not found Jan 11 17:45:37 crc kubenswrapper[4837]: I0111 17:45:37.324226 4837 scope.go:117] "RemoveContainer" containerID="18c92cd4a3bd3734317535ece1496fcc4cbb3dc6f700ae726a25757a3c2b42a2" Jan 11 17:45:38 crc kubenswrapper[4837]: I0111 17:45:38.285212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-5bpfr" event={"ID":"da885226-0b14-4626-8d89-7d4505ab29a1","Type":"ContainerStarted","Data":"7031d30a688e082186351ac6c3b7389534bae15f45716bbbe9097fa34fd59521"} Jan 11 17:45:38 crc kubenswrapper[4837]: I0111 17:45:38.285609 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-5bpfr" event={"ID":"da885226-0b14-4626-8d89-7d4505ab29a1","Type":"ContainerStarted","Data":"40f9328730369a103b895230e41e105f5bb6908a743a1b42a9fd274b9fb2e878"} Jan 11 17:45:38 crc kubenswrapper[4837]: I0111 17:45:38.285629 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:38 crc kubenswrapper[4837]: I0111 17:45:38.286383 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerStarted","Data":"861f742ab177c3e6a42f1d959ec65912d55a81172c648dfcc9e150abc25efe4e"} Jan 11 17:45:38 crc kubenswrapper[4837]: I0111 17:45:38.301153 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-5bpfr" podStartSLOduration=3.301138543 podStartE2EDuration="3.301138543s" podCreationTimestamp="2026-01-11 17:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 
17:45:38.300542818 +0000 UTC m=+912.478735534" watchObservedRunningTime="2026-01-11 17:45:38.301138543 +0000 UTC m=+912.479331249" Jan 11 17:45:38 crc kubenswrapper[4837]: I0111 17:45:38.372815 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" path="/var/lib/kubelet/pods/c3bf552a-61ce-44a2-9bfe-902e539a1291/volumes" Jan 11 17:45:39 crc kubenswrapper[4837]: I0111 17:45:39.342299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:39 crc kubenswrapper[4837]: I0111 17:45:39.367941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e3e46c8e-1e90-49a8-a3eb-879ccd3c4807-memberlist\") pod \"speaker-8bfbc\" (UID: \"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807\") " pod="metallb-system/speaker-8bfbc" Jan 11 17:45:39 crc kubenswrapper[4837]: I0111 17:45:39.531746 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8bfbc" Jan 11 17:45:39 crc kubenswrapper[4837]: W0111 17:45:39.573802 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e46c8e_1e90_49a8_a3eb_879ccd3c4807.slice/crio-a94793bd3c7e25bfe39edbec2b2b648fe54640574678f38058028bbe7400f56d WatchSource:0}: Error finding container a94793bd3c7e25bfe39edbec2b2b648fe54640574678f38058028bbe7400f56d: Status 404 returned error can't find the container with id a94793bd3c7e25bfe39edbec2b2b648fe54640574678f38058028bbe7400f56d Jan 11 17:45:40 crc kubenswrapper[4837]: I0111 17:45:40.301573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8bfbc" event={"ID":"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807","Type":"ContainerStarted","Data":"035e91994ab8cbdeb89e3da3b930093bb9ad0fe7e38d59f97f029136547be598"} Jan 11 17:45:40 crc kubenswrapper[4837]: I0111 17:45:40.301951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8bfbc" event={"ID":"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807","Type":"ContainerStarted","Data":"0a0eaa50525394f02f4136845a23cf85d9fb66753d14c3df222e2bda5a0f917b"} Jan 11 17:45:40 crc kubenswrapper[4837]: I0111 17:45:40.301973 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8bfbc" event={"ID":"e3e46c8e-1e90-49a8-a3eb-879ccd3c4807","Type":"ContainerStarted","Data":"a94793bd3c7e25bfe39edbec2b2b648fe54640574678f38058028bbe7400f56d"} Jan 11 17:45:40 crc kubenswrapper[4837]: I0111 17:45:40.302310 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8bfbc" Jan 11 17:45:40 crc kubenswrapper[4837]: I0111 17:45:40.317039 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8bfbc" podStartSLOduration=5.317021459 podStartE2EDuration="5.317021459s" podCreationTimestamp="2026-01-11 17:45:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:45:40.316756692 +0000 UTC m=+914.494949418" watchObservedRunningTime="2026-01-11 17:45:40.317021459 +0000 UTC m=+914.495214175" Jan 11 17:45:44 crc kubenswrapper[4837]: I0111 17:45:44.340107 4837 generic.go:334] "Generic (PLEG): container finished" podID="d4fb0b82-36cb-45ce-b356-1a740d312fcf" containerID="dd7b9880d97bb1ee98323eb451163fa1efbc587e6eb35952e16326b74706724b" exitCode=0 Jan 11 17:45:44 crc kubenswrapper[4837]: I0111 17:45:44.340214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerDied","Data":"dd7b9880d97bb1ee98323eb451163fa1efbc587e6eb35952e16326b74706724b"} Jan 11 17:45:44 crc kubenswrapper[4837]: I0111 17:45:44.344031 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" event={"ID":"aa1bc5b4-0f84-413a-a7fd-d2531bbb8265","Type":"ContainerStarted","Data":"e273767b5d9632d6049db89c866ad71a197847102ee92917b7741d4cc8b58ef8"} Jan 11 17:45:44 crc kubenswrapper[4837]: I0111 17:45:44.344207 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:44 crc kubenswrapper[4837]: I0111 17:45:44.401901 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" podStartSLOduration=2.001009459 podStartE2EDuration="9.40187652s" podCreationTimestamp="2026-01-11 17:45:35 +0000 UTC" firstStartedPulling="2026-01-11 17:45:36.329144973 +0000 UTC m=+910.507337709" lastFinishedPulling="2026-01-11 17:45:43.730012024 +0000 UTC m=+917.908204770" observedRunningTime="2026-01-11 17:45:44.392968021 +0000 UTC m=+918.571160747" watchObservedRunningTime="2026-01-11 17:45:44.40187652 +0000 UTC m=+918.580069266" Jan 11 17:45:45 
crc kubenswrapper[4837]: I0111 17:45:45.352177 4837 generic.go:334] "Generic (PLEG): container finished" podID="d4fb0b82-36cb-45ce-b356-1a740d312fcf" containerID="c453c5fd55ae04058e4c2ccf2a215beb5b0ee0a6702f7b91379b9a775e24adef" exitCode=0 Jan 11 17:45:45 crc kubenswrapper[4837]: I0111 17:45:45.352256 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerDied","Data":"c453c5fd55ae04058e4c2ccf2a215beb5b0ee0a6702f7b91379b9a775e24adef"} Jan 11 17:45:46 crc kubenswrapper[4837]: I0111 17:45:46.359302 4837 generic.go:334] "Generic (PLEG): container finished" podID="d4fb0b82-36cb-45ce-b356-1a740d312fcf" containerID="f6178d46f729f5de8a28042dcdff285026e13f31e14fe7f63722b88dcf3f565c" exitCode=0 Jan 11 17:45:46 crc kubenswrapper[4837]: I0111 17:45:46.359358 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerDied","Data":"f6178d46f729f5de8a28042dcdff285026e13f31e14fe7f63722b88dcf3f565c"} Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.370043 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerStarted","Data":"61db12e4f943323567dce452e3823ee5c37d9bbbeb0b32cc5148e40f537af9fb"} Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.370411 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.370424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerStarted","Data":"b63a95606b853d8c847d85064df30043c801e75bd6741526b578c61fd2a22245"} Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.370435 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerStarted","Data":"cef39a532ce56f553ae6d0a8e1e57beabdc215a155db6239aa16fe9670924a4f"} Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.370445 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerStarted","Data":"70a791e38db438681e9111c7347892ab6fd0a3d8829980ac25b77397b9a953c7"} Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.370456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerStarted","Data":"1dfda3b0873e309fb148b2a0fe255d2c923ae079f9e51c017e8f65c766c32484"} Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.370465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m6x95" event={"ID":"d4fb0b82-36cb-45ce-b356-1a740d312fcf","Type":"ContainerStarted","Data":"2c25b38968f41b7ba78b1e0075ee26cc96ba39f246ecf7b05cb9fda78ff175a3"} Jan 11 17:45:47 crc kubenswrapper[4837]: I0111 17:45:47.405169 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-m6x95" podStartSLOduration=6.145080236 podStartE2EDuration="12.405151298s" podCreationTimestamp="2026-01-11 17:45:35 +0000 UTC" firstStartedPulling="2026-01-11 17:45:37.508069223 +0000 UTC m=+911.686261919" lastFinishedPulling="2026-01-11 17:45:43.768140235 +0000 UTC m=+917.946332981" observedRunningTime="2026-01-11 17:45:47.399327902 +0000 UTC m=+921.577520598" watchObservedRunningTime="2026-01-11 17:45:47.405151298 +0000 UTC m=+921.583344004" Jan 11 17:45:49 crc kubenswrapper[4837]: I0111 17:45:49.537958 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8bfbc" Jan 11 17:45:51 crc kubenswrapper[4837]: I0111 17:45:51.442955 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:51 crc kubenswrapper[4837]: I0111 17:45:51.492079 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.704794 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-txbd7"] Jan 11 17:45:52 crc kubenswrapper[4837]: E0111 17:45:52.706270 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="extract-utilities" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.706428 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="extract-utilities" Jan 11 17:45:52 crc kubenswrapper[4837]: E0111 17:45:52.706563 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="extract-content" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.706712 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="extract-content" Jan 11 17:45:52 crc kubenswrapper[4837]: E0111 17:45:52.706868 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="registry-server" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.706985 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="registry-server" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.707278 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bf552a-61ce-44a2-9bfe-902e539a1291" containerName="registry-server" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.708027 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-txbd7" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.710102 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6vw8f" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.710310 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.710425 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.717495 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-txbd7"] Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.840366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcfg\" (UniqueName: \"kubernetes.io/projected/cdf85851-f855-4a22-83ea-14c5791be67b-kube-api-access-nqcfg\") pod \"openstack-operator-index-txbd7\" (UID: \"cdf85851-f855-4a22-83ea-14c5791be67b\") " pod="openstack-operators/openstack-operator-index-txbd7" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.941662 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqcfg\" (UniqueName: \"kubernetes.io/projected/cdf85851-f855-4a22-83ea-14c5791be67b-kube-api-access-nqcfg\") pod \"openstack-operator-index-txbd7\" (UID: \"cdf85851-f855-4a22-83ea-14c5791be67b\") " pod="openstack-operators/openstack-operator-index-txbd7" Jan 11 17:45:52 crc kubenswrapper[4837]: I0111 17:45:52.962631 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcfg\" (UniqueName: \"kubernetes.io/projected/cdf85851-f855-4a22-83ea-14c5791be67b-kube-api-access-nqcfg\") pod \"openstack-operator-index-txbd7\" (UID: 
\"cdf85851-f855-4a22-83ea-14c5791be67b\") " pod="openstack-operators/openstack-operator-index-txbd7" Jan 11 17:45:53 crc kubenswrapper[4837]: I0111 17:45:53.036192 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-txbd7" Jan 11 17:45:53 crc kubenswrapper[4837]: I0111 17:45:53.483377 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-txbd7"] Jan 11 17:45:54 crc kubenswrapper[4837]: I0111 17:45:54.425975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txbd7" event={"ID":"cdf85851-f855-4a22-83ea-14c5791be67b","Type":"ContainerStarted","Data":"d05e79b61c7fd4bc0438bed0c9b4ba7024fcd930d273bdd902806d9a18139c74"} Jan 11 17:45:55 crc kubenswrapper[4837]: I0111 17:45:55.865868 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-568c8" Jan 11 17:45:56 crc kubenswrapper[4837]: I0111 17:45:56.445355 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-m6x95" Jan 11 17:45:56 crc kubenswrapper[4837]: I0111 17:45:56.478059 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-txbd7"] Jan 11 17:45:56 crc kubenswrapper[4837]: I0111 17:45:56.569800 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-5bpfr" Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.079915 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qqjnc"] Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.080873 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.091362 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qqjnc"] Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.210858 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4st69\" (UniqueName: \"kubernetes.io/projected/815a7cf2-a384-4c14-954a-19e05a030e78-kube-api-access-4st69\") pod \"openstack-operator-index-qqjnc\" (UID: \"815a7cf2-a384-4c14-954a-19e05a030e78\") " pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.313038 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4st69\" (UniqueName: \"kubernetes.io/projected/815a7cf2-a384-4c14-954a-19e05a030e78-kube-api-access-4st69\") pod \"openstack-operator-index-qqjnc\" (UID: \"815a7cf2-a384-4c14-954a-19e05a030e78\") " pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.347998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4st69\" (UniqueName: \"kubernetes.io/projected/815a7cf2-a384-4c14-954a-19e05a030e78-kube-api-access-4st69\") pod \"openstack-operator-index-qqjnc\" (UID: \"815a7cf2-a384-4c14-954a-19e05a030e78\") " pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.410305 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.450954 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txbd7" event={"ID":"cdf85851-f855-4a22-83ea-14c5791be67b","Type":"ContainerStarted","Data":"cb01251a18230269f915c395f21badc2e3306fd058333470802ae25efd3ec2c1"} Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.511964 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-txbd7" podStartSLOduration=2.71528803 podStartE2EDuration="5.511939017s" podCreationTimestamp="2026-01-11 17:45:52 +0000 UTC" firstStartedPulling="2026-01-11 17:45:53.493422651 +0000 UTC m=+927.671615367" lastFinishedPulling="2026-01-11 17:45:56.290073648 +0000 UTC m=+930.468266354" observedRunningTime="2026-01-11 17:45:57.479124968 +0000 UTC m=+931.657317734" watchObservedRunningTime="2026-01-11 17:45:57.511939017 +0000 UTC m=+931.690131733" Jan 11 17:45:57 crc kubenswrapper[4837]: I0111 17:45:57.891168 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qqjnc"] Jan 11 17:45:57 crc kubenswrapper[4837]: W0111 17:45:57.902056 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815a7cf2_a384_4c14_954a_19e05a030e78.slice/crio-00fbb8e0f1015cc71ea6a26830d5f1adb6b7da7420742f42240bfbf14516b641 WatchSource:0}: Error finding container 00fbb8e0f1015cc71ea6a26830d5f1adb6b7da7420742f42240bfbf14516b641: Status 404 returned error can't find the container with id 00fbb8e0f1015cc71ea6a26830d5f1adb6b7da7420742f42240bfbf14516b641 Jan 11 17:45:58 crc kubenswrapper[4837]: I0111 17:45:58.459827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qqjnc" 
event={"ID":"815a7cf2-a384-4c14-954a-19e05a030e78","Type":"ContainerStarted","Data":"00fbb8e0f1015cc71ea6a26830d5f1adb6b7da7420742f42240bfbf14516b641"} Jan 11 17:45:58 crc kubenswrapper[4837]: I0111 17:45:58.460070 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-txbd7" podUID="cdf85851-f855-4a22-83ea-14c5791be67b" containerName="registry-server" containerID="cri-o://cb01251a18230269f915c395f21badc2e3306fd058333470802ae25efd3ec2c1" gracePeriod=2 Jan 11 17:46:00 crc kubenswrapper[4837]: I0111 17:46:00.480549 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qqjnc" event={"ID":"815a7cf2-a384-4c14-954a-19e05a030e78","Type":"ContainerStarted","Data":"12d7af41f9c857bcb20158533cc7c2697b2625b32b21d460484f00a220a71b17"} Jan 11 17:46:00 crc kubenswrapper[4837]: I0111 17:46:00.482730 4837 generic.go:334] "Generic (PLEG): container finished" podID="cdf85851-f855-4a22-83ea-14c5791be67b" containerID="cb01251a18230269f915c395f21badc2e3306fd058333470802ae25efd3ec2c1" exitCode=0 Jan 11 17:46:00 crc kubenswrapper[4837]: I0111 17:46:00.482804 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txbd7" event={"ID":"cdf85851-f855-4a22-83ea-14c5791be67b","Type":"ContainerDied","Data":"cb01251a18230269f915c395f21badc2e3306fd058333470802ae25efd3ec2c1"} Jan 11 17:46:00 crc kubenswrapper[4837]: I0111 17:46:00.831475 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-txbd7" Jan 11 17:46:00 crc kubenswrapper[4837]: I0111 17:46:00.852768 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qqjnc" podStartSLOduration=1.871335686 podStartE2EDuration="3.852749229s" podCreationTimestamp="2026-01-11 17:45:57 +0000 UTC" firstStartedPulling="2026-01-11 17:45:57.904986938 +0000 UTC m=+932.083179644" lastFinishedPulling="2026-01-11 17:45:59.886400481 +0000 UTC m=+934.064593187" observedRunningTime="2026-01-11 17:46:00.506013548 +0000 UTC m=+934.684206294" watchObservedRunningTime="2026-01-11 17:46:00.852749229 +0000 UTC m=+935.030941935" Jan 11 17:46:00 crc kubenswrapper[4837]: I0111 17:46:00.970334 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqcfg\" (UniqueName: \"kubernetes.io/projected/cdf85851-f855-4a22-83ea-14c5791be67b-kube-api-access-nqcfg\") pod \"cdf85851-f855-4a22-83ea-14c5791be67b\" (UID: \"cdf85851-f855-4a22-83ea-14c5791be67b\") " Jan 11 17:46:00 crc kubenswrapper[4837]: I0111 17:46:00.980005 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf85851-f855-4a22-83ea-14c5791be67b-kube-api-access-nqcfg" (OuterVolumeSpecName: "kube-api-access-nqcfg") pod "cdf85851-f855-4a22-83ea-14c5791be67b" (UID: "cdf85851-f855-4a22-83ea-14c5791be67b"). InnerVolumeSpecName "kube-api-access-nqcfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:46:01 crc kubenswrapper[4837]: I0111 17:46:01.073102 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqcfg\" (UniqueName: \"kubernetes.io/projected/cdf85851-f855-4a22-83ea-14c5791be67b-kube-api-access-nqcfg\") on node \"crc\" DevicePath \"\"" Jan 11 17:46:01 crc kubenswrapper[4837]: I0111 17:46:01.492768 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-txbd7" event={"ID":"cdf85851-f855-4a22-83ea-14c5791be67b","Type":"ContainerDied","Data":"d05e79b61c7fd4bc0438bed0c9b4ba7024fcd930d273bdd902806d9a18139c74"} Jan 11 17:46:01 crc kubenswrapper[4837]: I0111 17:46:01.492853 4837 scope.go:117] "RemoveContainer" containerID="cb01251a18230269f915c395f21badc2e3306fd058333470802ae25efd3ec2c1" Jan 11 17:46:01 crc kubenswrapper[4837]: I0111 17:46:01.492798 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-txbd7" Jan 11 17:46:01 crc kubenswrapper[4837]: I0111 17:46:01.541733 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-txbd7"] Jan 11 17:46:01 crc kubenswrapper[4837]: I0111 17:46:01.546558 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-txbd7"] Jan 11 17:46:02 crc kubenswrapper[4837]: I0111 17:46:02.370020 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf85851-f855-4a22-83ea-14c5791be67b" path="/var/lib/kubelet/pods/cdf85851-f855-4a22-83ea-14c5791be67b/volumes" Jan 11 17:46:07 crc kubenswrapper[4837]: I0111 17:46:07.410877 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:46:07 crc kubenswrapper[4837]: I0111 17:46:07.411161 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:46:07 crc kubenswrapper[4837]: I0111 17:46:07.452191 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:46:07 crc kubenswrapper[4837]: I0111 17:46:07.567429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qqjnc" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.713244 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg"] Jan 11 17:46:08 crc kubenswrapper[4837]: E0111 17:46:08.713509 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf85851-f855-4a22-83ea-14c5791be67b" containerName="registry-server" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.713521 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf85851-f855-4a22-83ea-14c5791be67b" containerName="registry-server" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.714129 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf85851-f855-4a22-83ea-14c5791be67b" containerName="registry-server" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.715175 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.722160 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-skl7t" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.781087 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg"] Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.827127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-bundle\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.827202 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhzc5\" (UniqueName: \"kubernetes.io/projected/a02b8a77-2a89-46a4-9aba-2472a30559f7-kube-api-access-vhzc5\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.827376 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-util\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 
17:46:08.928709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-util\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.928881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-bundle\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.929312 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-util\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.929622 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-bundle\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.929957 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhzc5\" (UniqueName: 
\"kubernetes.io/projected/a02b8a77-2a89-46a4-9aba-2472a30559f7-kube-api-access-vhzc5\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:08 crc kubenswrapper[4837]: I0111 17:46:08.948154 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhzc5\" (UniqueName: \"kubernetes.io/projected/a02b8a77-2a89-46a4-9aba-2472a30559f7-kube-api-access-vhzc5\") pod \"cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:09 crc kubenswrapper[4837]: I0111 17:46:09.059570 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:09 crc kubenswrapper[4837]: I0111 17:46:09.460191 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg"] Jan 11 17:46:09 crc kubenswrapper[4837]: W0111 17:46:09.467372 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda02b8a77_2a89_46a4_9aba_2472a30559f7.slice/crio-2218cd0c73233f89d11e6da1a8a2c7537699dfd0b270f96e8a5a09ec7e28d003 WatchSource:0}: Error finding container 2218cd0c73233f89d11e6da1a8a2c7537699dfd0b270f96e8a5a09ec7e28d003: Status 404 returned error can't find the container with id 2218cd0c73233f89d11e6da1a8a2c7537699dfd0b270f96e8a5a09ec7e28d003 Jan 11 17:46:09 crc kubenswrapper[4837]: I0111 17:46:09.541866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" 
event={"ID":"a02b8a77-2a89-46a4-9aba-2472a30559f7","Type":"ContainerStarted","Data":"2218cd0c73233f89d11e6da1a8a2c7537699dfd0b270f96e8a5a09ec7e28d003"} Jan 11 17:46:10 crc kubenswrapper[4837]: I0111 17:46:10.550520 4837 generic.go:334] "Generic (PLEG): container finished" podID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerID="130a53cf63a5516b4d37e77b1657a3c9cee406716650611b2f721e43ce39c913" exitCode=0 Jan 11 17:46:10 crc kubenswrapper[4837]: I0111 17:46:10.550708 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" event={"ID":"a02b8a77-2a89-46a4-9aba-2472a30559f7","Type":"ContainerDied","Data":"130a53cf63a5516b4d37e77b1657a3c9cee406716650611b2f721e43ce39c913"} Jan 11 17:46:11 crc kubenswrapper[4837]: I0111 17:46:11.562454 4837 generic.go:334] "Generic (PLEG): container finished" podID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerID="335ab6ba0614d69b57f5cb8f0b29318980c0204f08795a85b5dfb3180d60e522" exitCode=0 Jan 11 17:46:11 crc kubenswrapper[4837]: I0111 17:46:11.562522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" event={"ID":"a02b8a77-2a89-46a4-9aba-2472a30559f7","Type":"ContainerDied","Data":"335ab6ba0614d69b57f5cb8f0b29318980c0204f08795a85b5dfb3180d60e522"} Jan 11 17:46:12 crc kubenswrapper[4837]: I0111 17:46:12.575603 4837 generic.go:334] "Generic (PLEG): container finished" podID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerID="da88af6417180039f5a66d8b822c76328ead0866b8ec4dd0582956b6eff5b108" exitCode=0 Jan 11 17:46:12 crc kubenswrapper[4837]: I0111 17:46:12.575732 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" event={"ID":"a02b8a77-2a89-46a4-9aba-2472a30559f7","Type":"ContainerDied","Data":"da88af6417180039f5a66d8b822c76328ead0866b8ec4dd0582956b6eff5b108"} Jan 11 
17:46:13 crc kubenswrapper[4837]: I0111 17:46:13.878773 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.011119 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhzc5\" (UniqueName: \"kubernetes.io/projected/a02b8a77-2a89-46a4-9aba-2472a30559f7-kube-api-access-vhzc5\") pod \"a02b8a77-2a89-46a4-9aba-2472a30559f7\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.011281 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-util\") pod \"a02b8a77-2a89-46a4-9aba-2472a30559f7\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.011402 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-bundle\") pod \"a02b8a77-2a89-46a4-9aba-2472a30559f7\" (UID: \"a02b8a77-2a89-46a4-9aba-2472a30559f7\") " Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.017935 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02b8a77-2a89-46a4-9aba-2472a30559f7-kube-api-access-vhzc5" (OuterVolumeSpecName: "kube-api-access-vhzc5") pod "a02b8a77-2a89-46a4-9aba-2472a30559f7" (UID: "a02b8a77-2a89-46a4-9aba-2472a30559f7"). InnerVolumeSpecName "kube-api-access-vhzc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.020128 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-bundle" (OuterVolumeSpecName: "bundle") pod "a02b8a77-2a89-46a4-9aba-2472a30559f7" (UID: "a02b8a77-2a89-46a4-9aba-2472a30559f7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.038913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-util" (OuterVolumeSpecName: "util") pod "a02b8a77-2a89-46a4-9aba-2472a30559f7" (UID: "a02b8a77-2a89-46a4-9aba-2472a30559f7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.085257 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b8jbl"] Jan 11 17:46:14 crc kubenswrapper[4837]: E0111 17:46:14.085796 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerName="extract" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.085854 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerName="extract" Jan 11 17:46:14 crc kubenswrapper[4837]: E0111 17:46:14.085891 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerName="pull" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.085909 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerName="pull" Jan 11 17:46:14 crc kubenswrapper[4837]: E0111 17:46:14.085947 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02b8a77-2a89-46a4-9aba-2472a30559f7" 
containerName="util" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.085965 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerName="util" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.086230 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02b8a77-2a89-46a4-9aba-2472a30559f7" containerName="extract" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.088136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.097275 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8jbl"] Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.112924 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.112972 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhzc5\" (UniqueName: \"kubernetes.io/projected/a02b8a77-2a89-46a4-9aba-2472a30559f7-kube-api-access-vhzc5\") on node \"crc\" DevicePath \"\"" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.113000 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a02b8a77-2a89-46a4-9aba-2472a30559f7-util\") on node \"crc\" DevicePath \"\"" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.214193 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4njt\" (UniqueName: \"kubernetes.io/projected/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-kube-api-access-c4njt\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc 
kubenswrapper[4837]: I0111 17:46:14.214247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-utilities\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.214321 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-catalog-content\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.315018 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4njt\" (UniqueName: \"kubernetes.io/projected/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-kube-api-access-c4njt\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.315063 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-utilities\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.315112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-catalog-content\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 
crc kubenswrapper[4837]: I0111 17:46:14.315581 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-catalog-content\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.315845 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-utilities\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.331848 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4njt\" (UniqueName: \"kubernetes.io/projected/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-kube-api-access-c4njt\") pod \"redhat-marketplace-b8jbl\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.421522 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.592153 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" event={"ID":"a02b8a77-2a89-46a4-9aba-2472a30559f7","Type":"ContainerDied","Data":"2218cd0c73233f89d11e6da1a8a2c7537699dfd0b270f96e8a5a09ec7e28d003"} Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.592192 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2218cd0c73233f89d11e6da1a8a2c7537699dfd0b270f96e8a5a09ec7e28d003" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.592204 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg" Jan 11 17:46:14 crc kubenswrapper[4837]: I0111 17:46:14.630309 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8jbl"] Jan 11 17:46:14 crc kubenswrapper[4837]: W0111 17:46:14.636275 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd0dd5e_a203_4187_9d46_867f6a7f3df1.slice/crio-2cac1cc07b009eb877eb905e09899ab21fbd889d4fd095ea1ae06ce35c9ce4c9 WatchSource:0}: Error finding container 2cac1cc07b009eb877eb905e09899ab21fbd889d4fd095ea1ae06ce35c9ce4c9: Status 404 returned error can't find the container with id 2cac1cc07b009eb877eb905e09899ab21fbd889d4fd095ea1ae06ce35c9ce4c9 Jan 11 17:46:15 crc kubenswrapper[4837]: I0111 17:46:15.603996 4837 generic.go:334] "Generic (PLEG): container finished" podID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerID="c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693" exitCode=0 Jan 11 17:46:15 crc kubenswrapper[4837]: I0111 17:46:15.604619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-b8jbl" event={"ID":"dbd0dd5e-a203-4187-9d46-867f6a7f3df1","Type":"ContainerDied","Data":"c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693"} Jan 11 17:46:15 crc kubenswrapper[4837]: I0111 17:46:15.604765 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8jbl" event={"ID":"dbd0dd5e-a203-4187-9d46-867f6a7f3df1","Type":"ContainerStarted","Data":"2cac1cc07b009eb877eb905e09899ab21fbd889d4fd095ea1ae06ce35c9ce4c9"} Jan 11 17:46:16 crc kubenswrapper[4837]: I0111 17:46:16.613954 4837 generic.go:334] "Generic (PLEG): container finished" podID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerID="ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566" exitCode=0 Jan 11 17:46:16 crc kubenswrapper[4837]: I0111 17:46:16.614241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8jbl" event={"ID":"dbd0dd5e-a203-4187-9d46-867f6a7f3df1","Type":"ContainerDied","Data":"ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566"} Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.102082 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz"] Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.102940 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.105335 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gmgdr" Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.129183 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz"] Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.253418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr4g7\" (UniqueName: \"kubernetes.io/projected/229a8de5-0ba1-4408-b093-28e6e74c143b-kube-api-access-nr4g7\") pod \"openstack-operator-controller-operator-597c79dd4-2dspz\" (UID: \"229a8de5-0ba1-4408-b093-28e6e74c143b\") " pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.355202 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr4g7\" (UniqueName: \"kubernetes.io/projected/229a8de5-0ba1-4408-b093-28e6e74c143b-kube-api-access-nr4g7\") pod \"openstack-operator-controller-operator-597c79dd4-2dspz\" (UID: \"229a8de5-0ba1-4408-b093-28e6e74c143b\") " pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.377427 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr4g7\" (UniqueName: \"kubernetes.io/projected/229a8de5-0ba1-4408-b093-28e6e74c143b-kube-api-access-nr4g7\") pod \"openstack-operator-controller-operator-597c79dd4-2dspz\" (UID: \"229a8de5-0ba1-4408-b093-28e6e74c143b\") " pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.421534 4837 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" Jan 11 17:46:17 crc kubenswrapper[4837]: I0111 17:46:17.669665 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz"] Jan 11 17:46:17 crc kubenswrapper[4837]: W0111 17:46:17.672848 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod229a8de5_0ba1_4408_b093_28e6e74c143b.slice/crio-91c8f7743bf4682789c1c07fd0c84443a1465d49d0b6a1abf985fb7bd3fc96bb WatchSource:0}: Error finding container 91c8f7743bf4682789c1c07fd0c84443a1465d49d0b6a1abf985fb7bd3fc96bb: Status 404 returned error can't find the container with id 91c8f7743bf4682789c1c07fd0c84443a1465d49d0b6a1abf985fb7bd3fc96bb Jan 11 17:46:18 crc kubenswrapper[4837]: I0111 17:46:18.629898 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" event={"ID":"229a8de5-0ba1-4408-b093-28e6e74c143b","Type":"ContainerStarted","Data":"91c8f7743bf4682789c1c07fd0c84443a1465d49d0b6a1abf985fb7bd3fc96bb"} Jan 11 17:46:22 crc kubenswrapper[4837]: I0111 17:46:22.657245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8jbl" event={"ID":"dbd0dd5e-a203-4187-9d46-867f6a7f3df1","Type":"ContainerStarted","Data":"a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938"} Jan 11 17:46:24 crc kubenswrapper[4837]: I0111 17:46:24.691853 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b8jbl" podStartSLOduration=6.09181858 podStartE2EDuration="10.691836103s" podCreationTimestamp="2026-01-11 17:46:14 +0000 UTC" firstStartedPulling="2026-01-11 17:46:15.607620449 +0000 UTC m=+949.785813155" lastFinishedPulling="2026-01-11 
17:46:20.207637972 +0000 UTC m=+954.385830678" observedRunningTime="2026-01-11 17:46:24.691535685 +0000 UTC m=+958.869728391" watchObservedRunningTime="2026-01-11 17:46:24.691836103 +0000 UTC m=+958.870028809" Jan 11 17:46:33 crc kubenswrapper[4837]: I0111 17:46:33.734204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" event={"ID":"229a8de5-0ba1-4408-b093-28e6e74c143b","Type":"ContainerStarted","Data":"10cec28a5246d1975cd5884cb6f7197d569450585bb0370adc4bff236c043233"} Jan 11 17:46:33 crc kubenswrapper[4837]: I0111 17:46:33.735369 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" Jan 11 17:46:33 crc kubenswrapper[4837]: I0111 17:46:33.793448 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" podStartSLOduration=1.840032211 podStartE2EDuration="16.793430083s" podCreationTimestamp="2026-01-11 17:46:17 +0000 UTC" firstStartedPulling="2026-01-11 17:46:17.675909298 +0000 UTC m=+951.854102004" lastFinishedPulling="2026-01-11 17:46:32.62930717 +0000 UTC m=+966.807499876" observedRunningTime="2026-01-11 17:46:33.793292399 +0000 UTC m=+967.971485145" watchObservedRunningTime="2026-01-11 17:46:33.793430083 +0000 UTC m=+967.971622799" Jan 11 17:46:34 crc kubenswrapper[4837]: I0111 17:46:34.422581 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:34 crc kubenswrapper[4837]: I0111 17:46:34.422644 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:34 crc kubenswrapper[4837]: I0111 17:46:34.464827 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b8jbl" 
Jan 11 17:46:34 crc kubenswrapper[4837]: I0111 17:46:34.777790 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:34 crc kubenswrapper[4837]: I0111 17:46:34.815182 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8jbl"] Jan 11 17:46:36 crc kubenswrapper[4837]: I0111 17:46:36.756211 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b8jbl" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="registry-server" containerID="cri-o://a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938" gracePeriod=2 Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.426222 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-597c79dd4-2dspz" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.636305 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.736605 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-utilities\") pod \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.737051 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-catalog-content\") pod \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.737231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4njt\" (UniqueName: \"kubernetes.io/projected/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-kube-api-access-c4njt\") pod \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\" (UID: \"dbd0dd5e-a203-4187-9d46-867f6a7f3df1\") " Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.737705 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-utilities" (OuterVolumeSpecName: "utilities") pod "dbd0dd5e-a203-4187-9d46-867f6a7f3df1" (UID: "dbd0dd5e-a203-4187-9d46-867f6a7f3df1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.742551 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-kube-api-access-c4njt" (OuterVolumeSpecName: "kube-api-access-c4njt") pod "dbd0dd5e-a203-4187-9d46-867f6a7f3df1" (UID: "dbd0dd5e-a203-4187-9d46-867f6a7f3df1"). InnerVolumeSpecName "kube-api-access-c4njt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.757399 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbd0dd5e-a203-4187-9d46-867f6a7f3df1" (UID: "dbd0dd5e-a203-4187-9d46-867f6a7f3df1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.764094 4837 generic.go:334] "Generic (PLEG): container finished" podID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerID="a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938" exitCode=0 Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.764144 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8jbl" event={"ID":"dbd0dd5e-a203-4187-9d46-867f6a7f3df1","Type":"ContainerDied","Data":"a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938"} Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.764172 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8jbl" event={"ID":"dbd0dd5e-a203-4187-9d46-867f6a7f3df1","Type":"ContainerDied","Data":"2cac1cc07b009eb877eb905e09899ab21fbd889d4fd095ea1ae06ce35c9ce4c9"} Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.764191 4837 scope.go:117] "RemoveContainer" containerID="a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.764209 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8jbl" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.787409 4837 scope.go:117] "RemoveContainer" containerID="ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.805913 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8jbl"] Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.809842 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8jbl"] Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.829158 4837 scope.go:117] "RemoveContainer" containerID="c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.838519 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4njt\" (UniqueName: \"kubernetes.io/projected/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-kube-api-access-c4njt\") on node \"crc\" DevicePath \"\"" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.838560 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.838573 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbd0dd5e-a203-4187-9d46-867f6a7f3df1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.846608 4837 scope.go:117] "RemoveContainer" containerID="a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938" Jan 11 17:46:37 crc kubenswrapper[4837]: E0111 17:46:37.847094 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938\": container with ID starting with a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938 not found: ID does not exist" containerID="a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.847139 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938"} err="failed to get container status \"a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938\": rpc error: code = NotFound desc = could not find container \"a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938\": container with ID starting with a83c2f11510c66f3184d23620be546bd3cfbbe1ff7cbb11be75932784e183938 not found: ID does not exist" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.847164 4837 scope.go:117] "RemoveContainer" containerID="ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566" Jan 11 17:46:37 crc kubenswrapper[4837]: E0111 17:46:37.847426 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566\": container with ID starting with ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566 not found: ID does not exist" containerID="ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.847444 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566"} err="failed to get container status \"ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566\": rpc error: code = NotFound desc = could not find container \"ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566\": container with ID 
starting with ba0e76b4998f7787cc9b5327cfa98f569d222eff05ac4211508954470f89e566 not found: ID does not exist" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.847457 4837 scope.go:117] "RemoveContainer" containerID="c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693" Jan 11 17:46:37 crc kubenswrapper[4837]: E0111 17:46:37.847995 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693\": container with ID starting with c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693 not found: ID does not exist" containerID="c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693" Jan 11 17:46:37 crc kubenswrapper[4837]: I0111 17:46:37.848027 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693"} err="failed to get container status \"c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693\": rpc error: code = NotFound desc = could not find container \"c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693\": container with ID starting with c5c8dc49d037416a011e1beb6ef1f7f8084c59bc887d69b97b5b65fc0f7ed693 not found: ID does not exist" Jan 11 17:46:38 crc kubenswrapper[4837]: I0111 17:46:38.370990 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" path="/var/lib/kubelet/pods/dbd0dd5e-a203-4187-9d46-867f6a7f3df1/volumes" Jan 11 17:46:39 crc kubenswrapper[4837]: I0111 17:46:39.444780 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:46:39 crc kubenswrapper[4837]: I0111 
17:46:39.445160 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.940616 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m"] Jan 11 17:46:54 crc kubenswrapper[4837]: E0111 17:46:54.941394 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="extract-utilities" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.941407 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="extract-utilities" Jan 11 17:46:54 crc kubenswrapper[4837]: E0111 17:46:54.941424 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="extract-content" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.941431 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="extract-content" Jan 11 17:46:54 crc kubenswrapper[4837]: E0111 17:46:54.941444 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="registry-server" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.941450 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="registry-server" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.941568 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd0dd5e-a203-4187-9d46-867f6a7f3df1" containerName="registry-server" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 
17:46:54.942211 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.946973 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7zmml" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.951690 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m"] Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.973697 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7"] Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.974581 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.976552 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-mx9pb" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.983455 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-26d4f"] Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.984224 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" Jan 11 17:46:54 crc kubenswrapper[4837]: I0111 17:46:54.988632 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6bhw8" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.025437 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.078092 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-26d4f"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.084781 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.085746 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.087346 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlrq\" (UniqueName: \"kubernetes.io/projected/63384d88-7d49-4951-8ccd-10871b0b18ad-kube-api-access-krlrq\") pod \"designate-operator-controller-manager-9f958b845-26d4f\" (UID: \"63384d88-7d49-4951-8ccd-10871b0b18ad\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.087408 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5l5n\" (UniqueName: \"kubernetes.io/projected/02e82478-6974-4ae1-b8de-57688876d070-kube-api-access-s5l5n\") pod \"barbican-operator-controller-manager-7ddb5c749-bv24m\" (UID: \"02e82478-6974-4ae1-b8de-57688876d070\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.087457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7vbw\" (UniqueName: \"kubernetes.io/projected/eb2c9390-f27a-46b0-9249-3e9bdc0c99e3-kube-api-access-d7vbw\") pod \"cinder-operator-controller-manager-9b68f5989-6fhn7\" (UID: \"eb2c9390-f27a-46b0-9249-3e9bdc0c99e3\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.088998 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7wxqq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.118047 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 
17:46:55.124410 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.125296 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.127560 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dqnmk" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.134700 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.135607 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.137348 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xdpnb" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.164734 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.165552 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.169910 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gxtfq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.170132 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.176817 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.187743 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.188363 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krlrq\" (UniqueName: \"kubernetes.io/projected/63384d88-7d49-4951-8ccd-10871b0b18ad-kube-api-access-krlrq\") pod \"designate-operator-controller-manager-9f958b845-26d4f\" (UID: \"63384d88-7d49-4951-8ccd-10871b0b18ad\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.188406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qt9r\" (UniqueName: \"kubernetes.io/projected/c4d04eda-5046-43cd-b407-ed14ec61cbd6-kube-api-access-7qt9r\") pod \"glance-operator-controller-manager-c6994669c-9ptmm\" (UID: \"c4d04eda-5046-43cd-b407-ed14ec61cbd6\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.188428 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5l5n\" 
(UniqueName: \"kubernetes.io/projected/02e82478-6974-4ae1-b8de-57688876d070-kube-api-access-s5l5n\") pod \"barbican-operator-controller-manager-7ddb5c749-bv24m\" (UID: \"02e82478-6974-4ae1-b8de-57688876d070\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.188469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7vbw\" (UniqueName: \"kubernetes.io/projected/eb2c9390-f27a-46b0-9249-3e9bdc0c99e3-kube-api-access-d7vbw\") pod \"cinder-operator-controller-manager-9b68f5989-6fhn7\" (UID: \"eb2c9390-f27a-46b0-9249-3e9bdc0c99e3\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.188498 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5h6n\" (UniqueName: \"kubernetes.io/projected/fc05ccce-2544-4a54-bdf8-ec1b792ac1ba-kube-api-access-v5h6n\") pod \"horizon-operator-controller-manager-77d5c5b54f-75fgx\" (UID: \"fc05ccce-2544-4a54-bdf8-ec1b792ac1ba\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.188534 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbfn\" (UniqueName: \"kubernetes.io/projected/61f99042-0859-46d8-9af9-727352a885ee-kube-api-access-wsbfn\") pod \"heat-operator-controller-manager-594c8c9d5d-cz9h6\" (UID: \"61f99042-0859-46d8-9af9-727352a885ee\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.203725 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.217829 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s5l5n\" (UniqueName: \"kubernetes.io/projected/02e82478-6974-4ae1-b8de-57688876d070-kube-api-access-s5l5n\") pod \"barbican-operator-controller-manager-7ddb5c749-bv24m\" (UID: \"02e82478-6974-4ae1-b8de-57688876d070\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.223734 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.224485 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.239329 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kzmgl" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.239488 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.240166 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.247386 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-k5mbj" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.249131 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlrq\" (UniqueName: \"kubernetes.io/projected/63384d88-7d49-4951-8ccd-10871b0b18ad-kube-api-access-krlrq\") pod \"designate-operator-controller-manager-9f958b845-26d4f\" (UID: \"63384d88-7d49-4951-8ccd-10871b0b18ad\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.249160 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7vbw\" (UniqueName: \"kubernetes.io/projected/eb2c9390-f27a-46b0-9249-3e9bdc0c99e3-kube-api-access-d7vbw\") pod \"cinder-operator-controller-manager-9b68f5989-6fhn7\" (UID: \"eb2c9390-f27a-46b0-9249-3e9bdc0c99e3\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.249193 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.261248 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.269438 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.290188 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.291793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qt9r\" (UniqueName: \"kubernetes.io/projected/c4d04eda-5046-43cd-b407-ed14ec61cbd6-kube-api-access-7qt9r\") pod \"glance-operator-controller-manager-c6994669c-9ptmm\" (UID: \"c4d04eda-5046-43cd-b407-ed14ec61cbd6\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.291839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.291887 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h6n\" (UniqueName: \"kubernetes.io/projected/fc05ccce-2544-4a54-bdf8-ec1b792ac1ba-kube-api-access-v5h6n\") pod \"horizon-operator-controller-manager-77d5c5b54f-75fgx\" (UID: \"fc05ccce-2544-4a54-bdf8-ec1b792ac1ba\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.291906 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgfr\" (UniqueName: \"kubernetes.io/projected/537b7dae-5831-4fa5-afba-a5c7e1229e61-kube-api-access-csgfr\") pod 
\"ironic-operator-controller-manager-78757b4889-87jjr\" (UID: \"537b7dae-5831-4fa5-afba-a5c7e1229e61\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.291946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbfn\" (UniqueName: \"kubernetes.io/projected/61f99042-0859-46d8-9af9-727352a885ee-kube-api-access-wsbfn\") pod \"heat-operator-controller-manager-594c8c9d5d-cz9h6\" (UID: \"61f99042-0859-46d8-9af9-727352a885ee\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.291982 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl99t\" (UniqueName: \"kubernetes.io/projected/c2312108-ddf5-4939-acc1-727557936791-kube-api-access-zl99t\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.292031 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.300521 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.308244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbfn\" (UniqueName: \"kubernetes.io/projected/61f99042-0859-46d8-9af9-727352a885ee-kube-api-access-wsbfn\") pod \"heat-operator-controller-manager-594c8c9d5d-cz9h6\" (UID: \"61f99042-0859-46d8-9af9-727352a885ee\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.309743 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.309868 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.312126 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5jdj7" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.314069 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.314170 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.318216 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5h6n\" (UniqueName: \"kubernetes.io/projected/fc05ccce-2544-4a54-bdf8-ec1b792ac1ba-kube-api-access-v5h6n\") pod \"horizon-operator-controller-manager-77d5c5b54f-75fgx\" (UID: \"fc05ccce-2544-4a54-bdf8-ec1b792ac1ba\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.318523 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vfvtf" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.319922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qt9r\" (UniqueName: \"kubernetes.io/projected/c4d04eda-5046-43cd-b407-ed14ec61cbd6-kube-api-access-7qt9r\") pod \"glance-operator-controller-manager-c6994669c-9ptmm\" (UID: \"c4d04eda-5046-43cd-b407-ed14ec61cbd6\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.319948 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.369849 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-x648q"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.370832 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.374332 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-85jwk" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.384249 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.387446 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.390376 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kjkst" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.392722 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksczq\" (UniqueName: \"kubernetes.io/projected/8bdb5237-cb95-4e0c-b52c-85a8a419506b-kube-api-access-ksczq\") pod \"keystone-operator-controller-manager-767fdc4f47-jgrh6\" (UID: \"8bdb5237-cb95-4e0c-b52c-85a8a419506b\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.393118 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8d9h\" (UniqueName: \"kubernetes.io/projected/b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4-kube-api-access-v8d9h\") pod \"manila-operator-controller-manager-864f6b75bf-665ds\" (UID: \"b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.393165 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnmf4\" (UniqueName: \"kubernetes.io/projected/23a744d5-da8a-4fda-8c27-652e4f18d736-kube-api-access-hnmf4\") pod \"mariadb-operator-controller-manager-c87fff755-7k4z8\" (UID: \"23a744d5-da8a-4fda-8c27-652e4f18d736\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.393218 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl99t\" (UniqueName: \"kubernetes.io/projected/c2312108-ddf5-4939-acc1-727557936791-kube-api-access-zl99t\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.393576 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.393650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgfr\" (UniqueName: \"kubernetes.io/projected/537b7dae-5831-4fa5-afba-a5c7e1229e61-kube-api-access-csgfr\") pod \"ironic-operator-controller-manager-78757b4889-87jjr\" (UID: \"537b7dae-5831-4fa5-afba-a5c7e1229e61\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" Jan 11 17:46:55 crc kubenswrapper[4837]: E0111 17:46:55.394971 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:55 crc kubenswrapper[4837]: E0111 17:46:55.395219 
4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert podName:c2312108-ddf5-4939-acc1-727557936791 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:55.895138002 +0000 UTC m=+990.073330708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert") pod "infra-operator-controller-manager-77c48c7859-4hndt" (UID: "c2312108-ddf5-4939-acc1-727557936791") : secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.410775 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.424464 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgfr\" (UniqueName: \"kubernetes.io/projected/537b7dae-5831-4fa5-afba-a5c7e1229e61-kube-api-access-csgfr\") pod \"ironic-operator-controller-manager-78757b4889-87jjr\" (UID: \"537b7dae-5831-4fa5-afba-a5c7e1229e61\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.430496 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-x648q"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.439865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl99t\" (UniqueName: \"kubernetes.io/projected/c2312108-ddf5-4939-acc1-727557936791-kube-api-access-zl99t\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.442544 4837 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.443800 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.451617 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-xd69w" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.454137 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.461533 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.477206 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.480302 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.487431 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.488246 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.490241 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lxkqx" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.497166 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-wf222"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.499216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksczq\" (UniqueName: \"kubernetes.io/projected/8bdb5237-cb95-4e0c-b52c-85a8a419506b-kube-api-access-ksczq\") pod \"keystone-operator-controller-manager-767fdc4f47-jgrh6\" (UID: \"8bdb5237-cb95-4e0c-b52c-85a8a419506b\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.499253 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8d9h\" (UniqueName: \"kubernetes.io/projected/b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4-kube-api-access-v8d9h\") pod \"manila-operator-controller-manager-864f6b75bf-665ds\" (UID: \"b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.499299 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5vwj\" (UniqueName: \"kubernetes.io/projected/b8022f77-44ba-493f-bed8-ad82fa1ca45a-kube-api-access-g5vwj\") pod \"octavia-operator-controller-manager-7fc9b76cf6-2lvvk\" (UID: \"b8022f77-44ba-493f-bed8-ad82fa1ca45a\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.499327 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hnmf4\" (UniqueName: \"kubernetes.io/projected/23a744d5-da8a-4fda-8c27-652e4f18d736-kube-api-access-hnmf4\") pod \"mariadb-operator-controller-manager-c87fff755-7k4z8\" (UID: \"23a744d5-da8a-4fda-8c27-652e4f18d736\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.499446 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnbz\" (UniqueName: \"kubernetes.io/projected/69296cc2-890b-439c-8151-9b10963bae3f-kube-api-access-fnnbz\") pod \"neutron-operator-controller-manager-cb4666565-x648q\" (UID: \"69296cc2-890b-439c-8151-9b10963bae3f\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.499475 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmth\" (UniqueName: \"kubernetes.io/projected/b7039fa0-8e22-4369-abcd-baa005429b7b-kube-api-access-kxmth\") pod \"nova-operator-controller-manager-65849867d6-vm5gr\" (UID: \"b7039fa0-8e22-4369-abcd-baa005429b7b\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.500600 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.502492 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.503999 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.504061 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-d6nfs" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.507659 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2dhjx" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.508210 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.510381 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.515431 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.516388 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.520357 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8d9h\" (UniqueName: \"kubernetes.io/projected/b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4-kube-api-access-v8d9h\") pod \"manila-operator-controller-manager-864f6b75bf-665ds\" (UID: \"b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.520421 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-skp46" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.521004 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksczq\" (UniqueName: \"kubernetes.io/projected/8bdb5237-cb95-4e0c-b52c-85a8a419506b-kube-api-access-ksczq\") pod \"keystone-operator-controller-manager-767fdc4f47-jgrh6\" (UID: \"8bdb5237-cb95-4e0c-b52c-85a8a419506b\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.530147 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnmf4\" (UniqueName: \"kubernetes.io/projected/23a744d5-da8a-4fda-8c27-652e4f18d736-kube-api-access-hnmf4\") pod \"mariadb-operator-controller-manager-c87fff755-7k4z8\" (UID: \"23a744d5-da8a-4fda-8c27-652e4f18d736\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.560989 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.601608 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpdjj\" (UniqueName: \"kubernetes.io/projected/374c350e-a484-40a8-8563-45eb7f3eafd1-kube-api-access-xpdjj\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.601730 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5vwj\" (UniqueName: \"kubernetes.io/projected/b8022f77-44ba-493f-bed8-ad82fa1ca45a-kube-api-access-g5vwj\") pod \"octavia-operator-controller-manager-7fc9b76cf6-2lvvk\" (UID: \"b8022f77-44ba-493f-bed8-ad82fa1ca45a\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.601903 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.601966 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p247h\" (UniqueName: \"kubernetes.io/projected/5ea53463-b9a9-4406-b27f-ab1324f4bdcc-kube-api-access-p247h\") pod \"swift-operator-controller-manager-85dd56d4cc-hspqd\" (UID: \"5ea53463-b9a9-4406-b27f-ab1324f4bdcc\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.602018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5b5\" 
(UniqueName: \"kubernetes.io/projected/ddce549f-ba1d-483d-b50b-4011c826bbff-kube-api-access-sl5b5\") pod \"placement-operator-controller-manager-686df47fcb-wf222\" (UID: \"ddce549f-ba1d-483d-b50b-4011c826bbff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.602052 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnbz\" (UniqueName: \"kubernetes.io/projected/69296cc2-890b-439c-8151-9b10963bae3f-kube-api-access-fnnbz\") pod \"neutron-operator-controller-manager-cb4666565-x648q\" (UID: \"69296cc2-890b-439c-8151-9b10963bae3f\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.602083 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68qk\" (UniqueName: \"kubernetes.io/projected/8fe4bbe3-9aed-4232-9036-d53346db80b2-kube-api-access-v68qk\") pod \"ovn-operator-controller-manager-55db956ddc-4txs5\" (UID: \"8fe4bbe3-9aed-4232-9036-d53346db80b2\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.602117 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmth\" (UniqueName: \"kubernetes.io/projected/b7039fa0-8e22-4369-abcd-baa005429b7b-kube-api-access-kxmth\") pod \"nova-operator-controller-manager-65849867d6-vm5gr\" (UID: \"b7039fa0-8e22-4369-abcd-baa005429b7b\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.625556 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-wf222"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.627360 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-g5vwj\" (UniqueName: \"kubernetes.io/projected/b8022f77-44ba-493f-bed8-ad82fa1ca45a-kube-api-access-g5vwj\") pod \"octavia-operator-controller-manager-7fc9b76cf6-2lvvk\" (UID: \"b8022f77-44ba-493f-bed8-ad82fa1ca45a\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.628962 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxmth\" (UniqueName: \"kubernetes.io/projected/b7039fa0-8e22-4369-abcd-baa005429b7b-kube-api-access-kxmth\") pod \"nova-operator-controller-manager-65849867d6-vm5gr\" (UID: \"b7039fa0-8e22-4369-abcd-baa005429b7b\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.633752 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnbz\" (UniqueName: \"kubernetes.io/projected/69296cc2-890b-439c-8151-9b10963bae3f-kube-api-access-fnnbz\") pod \"neutron-operator-controller-manager-cb4666565-x648q\" (UID: \"69296cc2-890b-439c-8151-9b10963bae3f\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.639753 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.657771 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.658493 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.685129 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.686103 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.689782 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.690642 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lzdb4" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.703515 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.703938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5b5\" (UniqueName: \"kubernetes.io/projected/ddce549f-ba1d-483d-b50b-4011c826bbff-kube-api-access-sl5b5\") pod \"placement-operator-controller-manager-686df47fcb-wf222\" (UID: \"ddce549f-ba1d-483d-b50b-4011c826bbff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.703993 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v68qk\" (UniqueName: \"kubernetes.io/projected/8fe4bbe3-9aed-4232-9036-d53346db80b2-kube-api-access-v68qk\") pod \"ovn-operator-controller-manager-55db956ddc-4txs5\" (UID: \"8fe4bbe3-9aed-4232-9036-d53346db80b2\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.704069 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpdjj\" (UniqueName: \"kubernetes.io/projected/374c350e-a484-40a8-8563-45eb7f3eafd1-kube-api-access-xpdjj\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.704110 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 
17:46:55.704158 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p247h\" (UniqueName: \"kubernetes.io/projected/5ea53463-b9a9-4406-b27f-ab1324f4bdcc-kube-api-access-p247h\") pod \"swift-operator-controller-manager-85dd56d4cc-hspqd\" (UID: \"5ea53463-b9a9-4406-b27f-ab1324f4bdcc\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" Jan 11 17:46:55 crc kubenswrapper[4837]: E0111 17:46:55.704867 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:55 crc kubenswrapper[4837]: E0111 17:46:55.704920 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert podName:374c350e-a484-40a8-8563-45eb7f3eafd1 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:56.204902193 +0000 UTC m=+990.383094899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" (UID: "374c350e-a484-40a8-8563-45eb7f3eafd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.714699 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.715035 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.730832 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.731324 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpdjj\" (UniqueName: \"kubernetes.io/projected/374c350e-a484-40a8-8563-45eb7f3eafd1-kube-api-access-xpdjj\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.733745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p247h\" (UniqueName: \"kubernetes.io/projected/5ea53463-b9a9-4406-b27f-ab1324f4bdcc-kube-api-access-p247h\") pod \"swift-operator-controller-manager-85dd56d4cc-hspqd\" (UID: \"5ea53463-b9a9-4406-b27f-ab1324f4bdcc\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.739122 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68qk\" (UniqueName: \"kubernetes.io/projected/8fe4bbe3-9aed-4232-9036-d53346db80b2-kube-api-access-v68qk\") pod \"ovn-operator-controller-manager-55db956ddc-4txs5\" (UID: \"8fe4bbe3-9aed-4232-9036-d53346db80b2\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.752960 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5b5\" (UniqueName: \"kubernetes.io/projected/ddce549f-ba1d-483d-b50b-4011c826bbff-kube-api-access-sl5b5\") pod \"placement-operator-controller-manager-686df47fcb-wf222\" (UID: \"ddce549f-ba1d-483d-b50b-4011c826bbff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.776780 4837 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.777913 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.785048 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.785130 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-cv75k" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.791336 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.808378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8n4\" (UniqueName: \"kubernetes.io/projected/0296da23-fe5c-4f47-b26b-6d83da73bf31-kube-api-access-5s8n4\") pod \"telemetry-operator-controller-manager-5f8f495fcf-tkj2c\" (UID: \"0296da23-fe5c-4f47-b26b-6d83da73bf31\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.822469 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.838102 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.839614 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.843612 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-v4knj" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.852364 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz"] Jan 11 17:46:55 crc kubenswrapper[4837]: W0111 17:46:55.868396 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e82478_6974_4ae1_b8de_57688876d070.slice/crio-bf7fdf2cb6fd9bfee6da93b2d6c4f4b2465e42ffb5b464eae7b161583047a637 WatchSource:0}: Error finding container bf7fdf2cb6fd9bfee6da93b2d6c4f4b2465e42ffb5b464eae7b161583047a637: Status 404 returned error can't find the container with id bf7fdf2cb6fd9bfee6da93b2d6c4f4b2465e42ffb5b464eae7b161583047a637 Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.869384 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.883977 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.885044 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.888693 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.888983 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9ts8d" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.890398 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.891736 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.895156 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.918953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.918995 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4pg\" (UniqueName: \"kubernetes.io/projected/04067843-8e2d-4a0c-8c68-2e321669b605-kube-api-access-xv4pg\") pod \"test-operator-controller-manager-7cd8bc9dbb-shqx6\" (UID: \"04067843-8e2d-4a0c-8c68-2e321669b605\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" Jan 11 17:46:55 crc 
kubenswrapper[4837]: I0111 17:46:55.919024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr8g6\" (UniqueName: \"kubernetes.io/projected/56dd103a-afaf-46fa-9cf3-f85418264d29-kube-api-access-pr8g6\") pod \"watcher-operator-controller-manager-64cd966744-jsqjz\" (UID: \"56dd103a-afaf-46fa-9cf3-f85418264d29\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.919050 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8n4\" (UniqueName: \"kubernetes.io/projected/0296da23-fe5c-4f47-b26b-6d83da73bf31-kube-api-access-5s8n4\") pod \"telemetry-operator-controller-manager-5f8f495fcf-tkj2c\" (UID: \"0296da23-fe5c-4f47-b26b-6d83da73bf31\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" Jan 11 17:46:55 crc kubenswrapper[4837]: E0111 17:46:55.919284 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:55 crc kubenswrapper[4837]: E0111 17:46:55.919333 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert podName:c2312108-ddf5-4939-acc1-727557936791 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:56.919316245 +0000 UTC m=+991.097508951 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert") pod "infra-operator-controller-manager-77c48c7859-4hndt" (UID: "c2312108-ddf5-4939-acc1-727557936791") : secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.925296 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.939620 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z"] Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.941885 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.946647 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fnkzc" Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.950301 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" event={"ID":"02e82478-6974-4ae1-b8de-57688876d070","Type":"ContainerStarted","Data":"bf7fdf2cb6fd9bfee6da93b2d6c4f4b2465e42ffb5b464eae7b161583047a637"} Jan 11 17:46:55 crc kubenswrapper[4837]: I0111 17:46:55.952260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8n4\" (UniqueName: \"kubernetes.io/projected/0296da23-fe5c-4f47-b26b-6d83da73bf31-kube-api-access-5s8n4\") pod \"telemetry-operator-controller-manager-5f8f495fcf-tkj2c\" (UID: \"0296da23-fe5c-4f47-b26b-6d83da73bf31\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.009607 4837 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z"] Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.029449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.029732 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4pg\" (UniqueName: \"kubernetes.io/projected/04067843-8e2d-4a0c-8c68-2e321669b605-kube-api-access-xv4pg\") pod \"test-operator-controller-manager-7cd8bc9dbb-shqx6\" (UID: \"04067843-8e2d-4a0c-8c68-2e321669b605\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.029755 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b96f8\" (UniqueName: \"kubernetes.io/projected/c76abbe1-c9d2-414f-8c9a-372f8d5e17bc-kube-api-access-b96f8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kwp7z\" (UID: \"c76abbe1-c9d2-414f-8c9a-372f8d5e17bc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.029779 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt6gz\" (UniqueName: \"kubernetes.io/projected/3081056a-171f-44ab-a8c4-57a3c40686c4-kube-api-access-jt6gz\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " 
pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.029795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr8g6\" (UniqueName: \"kubernetes.io/projected/56dd103a-afaf-46fa-9cf3-f85418264d29-kube-api-access-pr8g6\") pod \"watcher-operator-controller-manager-64cd966744-jsqjz\" (UID: \"56dd103a-afaf-46fa-9cf3-f85418264d29\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.029821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.030628 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.083945 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7"] Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.121017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr8g6\" (UniqueName: \"kubernetes.io/projected/56dd103a-afaf-46fa-9cf3-f85418264d29-kube-api-access-pr8g6\") pod \"watcher-operator-controller-manager-64cd966744-jsqjz\" (UID: \"56dd103a-afaf-46fa-9cf3-f85418264d29\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.132759 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm"] Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.134407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.134513 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.134599 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b96f8\" (UniqueName: \"kubernetes.io/projected/c76abbe1-c9d2-414f-8c9a-372f8d5e17bc-kube-api-access-b96f8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kwp7z\" (UID: \"c76abbe1-c9d2-414f-8c9a-372f8d5e17bc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.134627 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt6gz\" (UniqueName: \"kubernetes.io/projected/3081056a-171f-44ab-a8c4-57a3c40686c4-kube-api-access-jt6gz\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.134967 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.135685 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:56.635647749 +0000 UTC m=+990.813840455 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "metrics-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.135939 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.136015 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:56.635992558 +0000 UTC m=+990.814185344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.136955 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-26d4f"] Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.144136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx"] Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.144256 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4pg\" (UniqueName: \"kubernetes.io/projected/04067843-8e2d-4a0c-8c68-2e321669b605-kube-api-access-xv4pg\") pod \"test-operator-controller-manager-7cd8bc9dbb-shqx6\" (UID: \"04067843-8e2d-4a0c-8c68-2e321669b605\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" Jan 
11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.167956 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt6gz\" (UniqueName: \"kubernetes.io/projected/3081056a-171f-44ab-a8c4-57a3c40686c4-kube-api-access-jt6gz\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.175149 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.182210 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b96f8\" (UniqueName: \"kubernetes.io/projected/c76abbe1-c9d2-414f-8c9a-372f8d5e17bc-kube-api-access-b96f8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kwp7z\" (UID: \"c76abbe1-c9d2-414f-8c9a-372f8d5e17bc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.220141 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.235809 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.236008 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.236050 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert podName:374c350e-a484-40a8-8563-45eb7f3eafd1 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:57.236036221 +0000 UTC m=+991.414228927 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" (UID: "374c350e-a484-40a8-8563-45eb7f3eafd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: W0111 17:46:56.253887 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63384d88_7d49_4951_8ccd_10871b0b18ad.slice/crio-5f7fde35e4a47e33aec29b28f957b1326587f377519135c1608351b5c8729efd WatchSource:0}: Error finding container 5f7fde35e4a47e33aec29b28f957b1326587f377519135c1608351b5c8729efd: Status 404 returned error can't find the container with id 5f7fde35e4a47e33aec29b28f957b1326587f377519135c1608351b5c8729efd Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.309775 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.558900 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6"] Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.643907 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.644053 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod 
\"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.644087 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.644154 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:57.6441364 +0000 UTC m=+991.822329106 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.644253 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.644288 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:57.644276534 +0000 UTC m=+991.822469240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "metrics-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.951399 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.951551 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: E0111 17:46:56.951600 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert podName:c2312108-ddf5-4939-acc1-727557936791 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:58.951585508 +0000 UTC m=+993.129778214 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert") pod "infra-operator-controller-manager-77c48c7859-4hndt" (UID: "c2312108-ddf5-4939-acc1-727557936791") : secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.956885 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" event={"ID":"fc05ccce-2544-4a54-bdf8-ec1b792ac1ba","Type":"ContainerStarted","Data":"3273de26fa6fbf27492f9d2f6872d141d1fc7c458091b380ba28670d9d9c44cc"} Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.962783 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" event={"ID":"c4d04eda-5046-43cd-b407-ed14ec61cbd6","Type":"ContainerStarted","Data":"4fde43b54dd59fc7d51780965b65e66c968b36f0d7abf409cd61b9ecbc0ec579"} Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.965764 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" event={"ID":"61f99042-0859-46d8-9af9-727352a885ee","Type":"ContainerStarted","Data":"a7367d6b61d55a441670a5ce889b7e4499b5994b075a8c1384def7f830c6c036"} Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.976699 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" event={"ID":"63384d88-7d49-4951-8ccd-10871b0b18ad","Type":"ContainerStarted","Data":"5f7fde35e4a47e33aec29b28f957b1326587f377519135c1608351b5c8729efd"} Jan 11 17:46:56 crc kubenswrapper[4837]: I0111 17:46:56.977977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" 
event={"ID":"eb2c9390-f27a-46b0-9249-3e9bdc0c99e3","Type":"ContainerStarted","Data":"b795389f2e1345ce92a268a6a94d77e8ad0ccc161d963a13337169453c1a61c7"} Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.032756 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.038260 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.048563 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-x648q"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.052522 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-wf222"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.061711 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk"] Jan 11 17:46:57 crc kubenswrapper[4837]: W0111 17:46:57.062754 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod537b7dae_5831_4fa5_afba_a5c7e1229e61.slice/crio-b82c85dbcf96f0ceab8a070a1606c7dafb3ea67bf3442646f53b74fe7ad70840 WatchSource:0}: Error finding container b82c85dbcf96f0ceab8a070a1606c7dafb3ea67bf3442646f53b74fe7ad70840: Status 404 returned error can't find the container with id b82c85dbcf96f0ceab8a070a1606c7dafb3ea67bf3442646f53b74fe7ad70840 Jan 11 17:46:57 crc kubenswrapper[4837]: W0111 17:46:57.063332 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69296cc2_890b_439c_8151_9b10963bae3f.slice/crio-643bc4134bfc47e93e62c28639a0a11c0c53ce1dbdc1effa40c81b52672ac1ef 
WatchSource:0}: Error finding container 643bc4134bfc47e93e62c28639a0a11c0c53ce1dbdc1effa40c81b52672ac1ef: Status 404 returned error can't find the container with id 643bc4134bfc47e93e62c28639a0a11c0c53ce1dbdc1effa40c81b52672ac1ef Jan 11 17:46:57 crc kubenswrapper[4837]: W0111 17:46:57.064611 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bdb5237_cb95_4e0c_b52c_85a8a419506b.slice/crio-0ad89b537bd1ff32b77262f8e3249e56ecec11c4b93de8a0032a86fcc58a876c WatchSource:0}: Error finding container 0ad89b537bd1ff32b77262f8e3249e56ecec11c4b93de8a0032a86fcc58a876c: Status 404 returned error can't find the container with id 0ad89b537bd1ff32b77262f8e3249e56ecec11c4b93de8a0032a86fcc58a876c Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.066802 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.074984 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.086156 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.105707 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.110974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.118350 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5"] Jan 11 17:46:57 crc kubenswrapper[4837]: W0111 
17:46:57.119434 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea53463_b9a9_4406_b27f_ab1324f4bdcc.slice/crio-3b79d51a0f60e696737e39eaf489ee7f2b1d19b9705d9c0dba5df56eac29649a WatchSource:0}: Error finding container 3b79d51a0f60e696737e39eaf489ee7f2b1d19b9705d9c0dba5df56eac29649a: Status 404 returned error can't find the container with id 3b79d51a0f60e696737e39eaf489ee7f2b1d19b9705d9c0dba5df56eac29649a Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.126899 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5s8n4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-tkj2c_openstack-operators(0296da23-fe5c-4f47-b26b-6d83da73bf31): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.126909 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xv4pg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-shqx6_openstack-operators(04067843-8e2d-4a0c-8c68-2e321669b605): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.127544 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b96f8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-kwp7z_openstack-operators(c76abbe1-c9d2-414f-8c9a-372f8d5e17bc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.128005 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" podUID="0296da23-fe5c-4f47-b26b-6d83da73bf31" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.128069 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" podUID="04067843-8e2d-4a0c-8c68-2e321669b605" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.128838 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with 
ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" podUID="c76abbe1-c9d2-414f-8c9a-372f8d5e17bc" Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.132550 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c"] Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.136508 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v8d9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-864f6b75bf-665ds_openstack-operators(b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.137691 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" podUID="b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4" Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.141522 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.148115 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6"] Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.256473 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.256924 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.257166 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert podName:374c350e-a484-40a8-8563-45eb7f3eafd1 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:59.257149546 +0000 UTC m=+993.435342252 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" (UID: "374c350e-a484-40a8-8563-45eb7f3eafd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.666400 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.666498 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " 
pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.666639 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.666705 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:59.666671532 +0000 UTC m=+993.844864238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "metrics-server-cert" not found Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.666958 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.667041 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:46:59.667019711 +0000 UTC m=+993.845212477 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "webhook-server-cert" not found Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.986233 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" event={"ID":"69296cc2-890b-439c-8151-9b10963bae3f","Type":"ContainerStarted","Data":"643bc4134bfc47e93e62c28639a0a11c0c53ce1dbdc1effa40c81b52672ac1ef"} Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.988770 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" event={"ID":"8bdb5237-cb95-4e0c-b52c-85a8a419506b","Type":"ContainerStarted","Data":"0ad89b537bd1ff32b77262f8e3249e56ecec11c4b93de8a0032a86fcc58a876c"} Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.990108 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" event={"ID":"b8022f77-44ba-493f-bed8-ad82fa1ca45a","Type":"ContainerStarted","Data":"a9bc6e38e6419b6b4f6679c638afdaac116061968fe3d4db77eb57ee62e25864"} Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.992134 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" event={"ID":"b7039fa0-8e22-4369-abcd-baa005429b7b","Type":"ContainerStarted","Data":"ed1b471e506b043c0f776c63cb5618d8d37e92e97bea79156c7a43c899c1f938"} Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.993216 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" 
event={"ID":"ddce549f-ba1d-483d-b50b-4011c826bbff","Type":"ContainerStarted","Data":"896cd1a311bc033a7c1f858b74c4a1cb243e52305aabe638d4e077e54c0e5460"} Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.994769 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" event={"ID":"8fe4bbe3-9aed-4232-9036-d53346db80b2","Type":"ContainerStarted","Data":"34052e7fb9e60d504601f3ecc3564315e4910d009073ee60f90d792b9a150cb9"} Jan 11 17:46:57 crc kubenswrapper[4837]: I0111 17:46:57.996918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" event={"ID":"c76abbe1-c9d2-414f-8c9a-372f8d5e17bc","Type":"ContainerStarted","Data":"e1c4b69b8e7c263d4f8fda9fdf1c51041473cec78defc65dffc7c12d18af3be2"} Jan 11 17:46:57 crc kubenswrapper[4837]: E0111 17:46:57.998415 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" podUID="c76abbe1-c9d2-414f-8c9a-372f8d5e17bc" Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.005336 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" event={"ID":"04067843-8e2d-4a0c-8c68-2e321669b605","Type":"ContainerStarted","Data":"143174fd3f8b04225e731d145ad27d784c77d3a195e47dd6e636e6d4f877500a"} Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.006692 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" 
event={"ID":"23a744d5-da8a-4fda-8c27-652e4f18d736","Type":"ContainerStarted","Data":"bde62ad1cb8078ea83c5f4df5fde7cc6745bc3a6244c5b86204ae44a1473d3b6"} Jan 11 17:46:58 crc kubenswrapper[4837]: E0111 17:46:58.006793 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" podUID="04067843-8e2d-4a0c-8c68-2e321669b605" Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.013818 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" event={"ID":"537b7dae-5831-4fa5-afba-a5c7e1229e61","Type":"ContainerStarted","Data":"b82c85dbcf96f0ceab8a070a1606c7dafb3ea67bf3442646f53b74fe7ad70840"} Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.017061 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" event={"ID":"b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4","Type":"ContainerStarted","Data":"100a0762c0f787a5f3807727c4387789c277919f6774a0d06e5bcde5c7964ca5"} Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.018570 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" event={"ID":"56dd103a-afaf-46fa-9cf3-f85418264d29","Type":"ContainerStarted","Data":"babc733aa7b408f001f3c9ada45116954c4434dcbfc2d528a262d3eafef8c988"} Jan 11 17:46:58 crc kubenswrapper[4837]: E0111 17:46:58.018688 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32\\\"\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" podUID="b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4" Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.022421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" event={"ID":"5ea53463-b9a9-4406-b27f-ab1324f4bdcc","Type":"ContainerStarted","Data":"3b79d51a0f60e696737e39eaf489ee7f2b1d19b9705d9c0dba5df56eac29649a"} Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.024091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" event={"ID":"0296da23-fe5c-4f47-b26b-6d83da73bf31","Type":"ContainerStarted","Data":"a6b5edec02fb6c5c17250dc36479af91b08fc0dcd77fe1b83cebca7a0606f16a"} Jan 11 17:46:58 crc kubenswrapper[4837]: E0111 17:46:58.031538 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" podUID="0296da23-fe5c-4f47-b26b-6d83da73bf31" Jan 11 17:46:58 crc kubenswrapper[4837]: I0111 17:46:58.986525 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:46:58 crc kubenswrapper[4837]: E0111 17:46:58.988789 4837 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:58 crc kubenswrapper[4837]: E0111 17:46:58.988883 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert podName:c2312108-ddf5-4939-acc1-727557936791 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:02.988860333 +0000 UTC m=+997.167053109 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert") pod "infra-operator-controller-manager-77c48c7859-4hndt" (UID: "c2312108-ddf5-4939-acc1-727557936791") : secret "infra-operator-webhook-server-cert" not found Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.032195 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" podUID="04067843-8e2d-4a0c-8c68-2e321669b605" Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.036922 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" podUID="0296da23-fe5c-4f47-b26b-6d83da73bf31" Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.036998 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32\\\"\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" podUID="b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4" Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.037045 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" podUID="c76abbe1-c9d2-414f-8c9a-372f8d5e17bc" Jan 11 17:46:59 crc kubenswrapper[4837]: I0111 17:46:59.293033 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.293185 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.293232 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert podName:374c350e-a484-40a8-8563-45eb7f3eafd1 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:03.293215358 +0000 UTC m=+997.471408064 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" (UID: "374c350e-a484-40a8-8563-45eb7f3eafd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:46:59 crc kubenswrapper[4837]: I0111 17:46:59.700079 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:59 crc kubenswrapper[4837]: I0111 17:46:59.700173 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.700300 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.700398 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:03.700375991 +0000 UTC m=+997.878568697 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "metrics-server-cert" not found Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.700305 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 11 17:46:59 crc kubenswrapper[4837]: E0111 17:46:59.700488 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:03.700471484 +0000 UTC m=+997.878664190 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "webhook-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: I0111 17:47:03.052299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.052510 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.052877 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert 
podName:c2312108-ddf5-4939-acc1-727557936791 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:11.05285594 +0000 UTC m=+1005.231048646 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert") pod "infra-operator-controller-manager-77c48c7859-4hndt" (UID: "c2312108-ddf5-4939-acc1-727557936791") : secret "infra-operator-webhook-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: I0111 17:47:03.356985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.357147 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.357233 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert podName:374c350e-a484-40a8-8563-45eb7f3eafd1 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:11.357216224 +0000 UTC m=+1005.535408930 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" (UID: "374c350e-a484-40a8-8563-45eb7f3eafd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: I0111 17:47:03.761815 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:03 crc kubenswrapper[4837]: I0111 17:47:03.761941 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.761999 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.762068 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.762078 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:11.762059796 +0000 UTC m=+1005.940252502 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "metrics-server-cert" not found Jan 11 17:47:03 crc kubenswrapper[4837]: E0111 17:47:03.762113 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:11.762099287 +0000 UTC m=+1005.940291993 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "webhook-server-cert" not found Jan 11 17:47:09 crc kubenswrapper[4837]: I0111 17:47:09.444928 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:47:09 crc kubenswrapper[4837]: I0111 17:47:09.445534 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:47:11 crc kubenswrapper[4837]: I0111 17:47:11.054870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod 
\"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.055078 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.055360 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert podName:c2312108-ddf5-4939-acc1-727557936791 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:27.055342036 +0000 UTC m=+1021.233534752 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert") pod "infra-operator-controller-manager-77c48c7859-4hndt" (UID: "c2312108-ddf5-4939-acc1-727557936791") : secret "infra-operator-webhook-server-cert" not found Jan 11 17:47:11 crc kubenswrapper[4837]: I0111 17:47:11.359424 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.359659 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.359871 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert podName:374c350e-a484-40a8-8563-45eb7f3eafd1 nodeName:}" failed. 
No retries permitted until 2026-01-11 17:47:27.359830114 +0000 UTC m=+1021.538022850 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" (UID: "374c350e-a484-40a8-8563-45eb7f3eafd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 11 17:47:11 crc kubenswrapper[4837]: I0111 17:47:11.765105 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:11 crc kubenswrapper[4837]: I0111 17:47:11.765200 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.765294 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.765396 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:27.765374674 +0000 UTC m=+1021.943567390 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "webhook-server-cert" not found Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.765412 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 11 17:47:11 crc kubenswrapper[4837]: E0111 17:47:11.765486 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs podName:3081056a-171f-44ab-a8c4-57a3c40686c4 nodeName:}" failed. No retries permitted until 2026-01-11 17:47:27.765466767 +0000 UTC m=+1021.943659473 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs") pod "openstack-operator-controller-manager-5569b88c46-6jqzq" (UID: "3081056a-171f-44ab-a8c4-57a3c40686c4") : secret "metrics-server-cert" not found Jan 11 17:47:25 crc kubenswrapper[4837]: E0111 17:47:25.480374 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729" Jan 11 17:47:25 crc kubenswrapper[4837]: E0111 17:47:25.481303 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g5vwj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7fc9b76cf6-2lvvk_openstack-operators(b8022f77-44ba-493f-bed8-ad82fa1ca45a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:47:25 crc kubenswrapper[4837]: E0111 17:47:25.482708 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" podUID="b8022f77-44ba-493f-bed8-ad82fa1ca45a" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.109662 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.118516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c2312108-ddf5-4939-acc1-727557936791-cert\") pod \"infra-operator-controller-manager-77c48c7859-4hndt\" (UID: \"c2312108-ddf5-4939-acc1-727557936791\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.301258 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gxtfq" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.309130 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.415262 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.423282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/374c350e-a484-40a8-8563-45eb7f3eafd1-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq\" (UID: \"374c350e-a484-40a8-8563-45eb7f3eafd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.692823 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2dhjx" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.701170 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.822076 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.822237 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.830153 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-metrics-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:27 crc kubenswrapper[4837]: I0111 17:47:27.830614 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3081056a-171f-44ab-a8c4-57a3c40686c4-webhook-certs\") pod \"openstack-operator-controller-manager-5569b88c46-6jqzq\" (UID: \"3081056a-171f-44ab-a8c4-57a3c40686c4\") " pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:28 crc kubenswrapper[4837]: I0111 17:47:28.066467 4837 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9ts8d" Jan 11 17:47:28 crc kubenswrapper[4837]: I0111 17:47:28.074849 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:29 crc kubenswrapper[4837]: E0111 17:47:29.309332 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" podUID="b8022f77-44ba-493f-bed8-ad82fa1ca45a" Jan 11 17:47:29 crc kubenswrapper[4837]: E0111 17:47:29.865227 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737" Jan 11 17:47:29 crc kubenswrapper[4837]: E0111 17:47:29.865705 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sl5b5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-wf222_openstack-operators(ddce549f-ba1d-483d-b50b-4011c826bbff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:47:29 crc kubenswrapper[4837]: E0111 17:47:29.866854 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" podUID="ddce549f-ba1d-483d-b50b-4011c826bbff" Jan 11 17:47:30 crc kubenswrapper[4837]: E0111 17:47:30.287396 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" podUID="ddce549f-ba1d-483d-b50b-4011c826bbff" Jan 11 17:47:31 crc kubenswrapper[4837]: E0111 17:47:31.834036 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492" Jan 11 17:47:31 crc kubenswrapper[4837]: E0111 17:47:31.834255 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsbfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-594c8c9d5d-cz9h6_openstack-operators(61f99042-0859-46d8-9af9-727352a885ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:47:31 crc kubenswrapper[4837]: E0111 17:47:31.835444 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" podUID="61f99042-0859-46d8-9af9-727352a885ee" Jan 11 17:47:32 crc kubenswrapper[4837]: E0111 17:47:32.301035 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492\\\"\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" podUID="61f99042-0859-46d8-9af9-727352a885ee" Jan 11 17:47:35 crc kubenswrapper[4837]: E0111 17:47:35.451069 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e" Jan 11 17:47:35 crc kubenswrapper[4837]: E0111 17:47:35.451494 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ksczq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-767fdc4f47-jgrh6_openstack-operators(8bdb5237-cb95-4e0c-b52c-85a8a419506b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:47:35 crc kubenswrapper[4837]: E0111 17:47:35.452821 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" podUID="8bdb5237-cb95-4e0c-b52c-85a8a419506b" Jan 11 17:47:36 crc kubenswrapper[4837]: E0111 17:47:36.324924 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" podUID="8bdb5237-cb95-4e0c-b52c-85a8a419506b" Jan 11 17:47:36 crc kubenswrapper[4837]: E0111 17:47:36.469746 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231" Jan 11 17:47:36 crc kubenswrapper[4837]: E0111 17:47:36.469945 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kxmth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-65849867d6-vm5gr_openstack-operators(b7039fa0-8e22-4369-abcd-baa005429b7b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:47:36 crc kubenswrapper[4837]: E0111 17:47:36.472335 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" podUID="b7039fa0-8e22-4369-abcd-baa005429b7b" Jan 11 17:47:37 crc kubenswrapper[4837]: E0111 17:47:37.329952 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231\\\"\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" podUID="b7039fa0-8e22-4369-abcd-baa005429b7b" Jan 11 17:47:39 crc kubenswrapper[4837]: I0111 17:47:39.444350 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:47:39 crc kubenswrapper[4837]: I0111 17:47:39.444736 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:47:39 crc kubenswrapper[4837]: I0111 17:47:39.444789 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:47:39 crc kubenswrapper[4837]: I0111 17:47:39.445382 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fefbb2dded3dd3108c6af21a68e03b8d09000012756ad3a506890b3d9bced335"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 17:47:39 crc 
kubenswrapper[4837]: I0111 17:47:39.445451 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://fefbb2dded3dd3108c6af21a68e03b8d09000012756ad3a506890b3d9bced335" gracePeriod=600 Jan 11 17:47:40 crc kubenswrapper[4837]: I0111 17:47:40.349022 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="fefbb2dded3dd3108c6af21a68e03b8d09000012756ad3a506890b3d9bced335" exitCode=0 Jan 11 17:47:40 crc kubenswrapper[4837]: I0111 17:47:40.349108 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"fefbb2dded3dd3108c6af21a68e03b8d09000012756ad3a506890b3d9bced335"} Jan 11 17:47:40 crc kubenswrapper[4837]: I0111 17:47:40.349407 4837 scope.go:117] "RemoveContainer" containerID="6987b5d3ba894eab85f4ec92caa53aa44c7a42f0a8df2d9713c40a8de9354658" Jan 11 17:47:41 crc kubenswrapper[4837]: I0111 17:47:41.233581 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq"] Jan 11 17:47:41 crc kubenswrapper[4837]: W0111 17:47:41.269709 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod374c350e_a484_40a8_8563_45eb7f3eafd1.slice/crio-332f79fe5feb3e7ce10434e493ee75441cf74f951d31d1406dd07bd67fe9fc4b WatchSource:0}: Error finding container 332f79fe5feb3e7ce10434e493ee75441cf74f951d31d1406dd07bd67fe9fc4b: Status 404 returned error can't find the container with id 332f79fe5feb3e7ce10434e493ee75441cf74f951d31d1406dd07bd67fe9fc4b Jan 11 17:47:41 crc kubenswrapper[4837]: I0111 17:47:41.272731 4837 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Jan 11 17:47:41 crc kubenswrapper[4837]: I0111 17:47:41.352866 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt"] Jan 11 17:47:41 crc kubenswrapper[4837]: I0111 17:47:41.358371 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" event={"ID":"374c350e-a484-40a8-8563-45eb7f3eafd1","Type":"ContainerStarted","Data":"332f79fe5feb3e7ce10434e493ee75441cf74f951d31d1406dd07bd67fe9fc4b"} Jan 11 17:47:41 crc kubenswrapper[4837]: I0111 17:47:41.429589 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq"] Jan 11 17:47:41 crc kubenswrapper[4837]: W0111 17:47:41.434323 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3081056a_171f_44ab_a8c4_57a3c40686c4.slice/crio-bd7fada25b9aef53ff0da9566184dfca9053176373806cc59595a0fa44b93bee WatchSource:0}: Error finding container bd7fada25b9aef53ff0da9566184dfca9053176373806cc59595a0fa44b93bee: Status 404 returned error can't find the container with id bd7fada25b9aef53ff0da9566184dfca9053176373806cc59595a0fa44b93bee Jan 11 17:47:42 crc kubenswrapper[4837]: I0111 17:47:42.379418 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" event={"ID":"3081056a-171f-44ab-a8c4-57a3c40686c4","Type":"ContainerStarted","Data":"bd7fada25b9aef53ff0da9566184dfca9053176373806cc59595a0fa44b93bee"} Jan 11 17:47:42 crc kubenswrapper[4837]: I0111 17:47:42.382359 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" 
event={"ID":"c2312108-ddf5-4939-acc1-727557936791","Type":"ContainerStarted","Data":"6cd2c2f1273ebbaae0762f445281118c40bce778009441c150ee33b979428615"} Jan 11 17:47:43 crc kubenswrapper[4837]: I0111 17:47:43.393649 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" event={"ID":"63384d88-7d49-4951-8ccd-10871b0b18ad","Type":"ContainerStarted","Data":"1f70a4de439d2f81de38e065d5040f4b6ae2e613151e71953e1d9dac10b15f00"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.400158 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" event={"ID":"fc05ccce-2544-4a54-bdf8-ec1b792ac1ba","Type":"ContainerStarted","Data":"0035e2826f0b69d657a63572ec2d2cc1dc405e74df1d2e027be3a3d5db18517c"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.400475 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.401353 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" event={"ID":"c4d04eda-5046-43cd-b407-ed14ec61cbd6","Type":"ContainerStarted","Data":"1406a0559da0eb50052290a45922be0a4067b8b3f0fc9d4a3103d28affc3a340"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.401525 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.402944 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" event={"ID":"8fe4bbe3-9aed-4232-9036-d53346db80b2","Type":"ContainerStarted","Data":"97ce7d73aab6c389c63b23661a95811056f1cdd2bea0f284b6ccaa2084053423"} Jan 11 17:47:44 crc 
kubenswrapper[4837]: I0111 17:47:44.403061 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.404317 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" event={"ID":"b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4","Type":"ContainerStarted","Data":"44b0782f882e83b1414118796572a2a0fcec5d02f9201b74a6ce02fd6b89fcdf"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.404480 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.405868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" event={"ID":"3081056a-171f-44ab-a8c4-57a3c40686c4","Type":"ContainerStarted","Data":"956edfdc6b1bd8003e397b3d456e67b6c18f17c09a7ae5186c42ee3106da5490"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.405995 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.407887 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"018458ba654491d89959a574a332d71c982ec5d0e53864759d97887f8cd68688"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.409254 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" event={"ID":"23a744d5-da8a-4fda-8c27-652e4f18d736","Type":"ContainerStarted","Data":"289b4318e1b52b43325376e47511286758fad29ffbb1f70ef944b1b616e9670a"} Jan 
11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.409402 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.410866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" event={"ID":"537b7dae-5831-4fa5-afba-a5c7e1229e61","Type":"ContainerStarted","Data":"4c8b29c308e389ed7a710f095b31b9321024ab5d04e1dac64a32aecc57c2029e"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.410989 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.412282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" event={"ID":"0296da23-fe5c-4f47-b26b-6d83da73bf31","Type":"ContainerStarted","Data":"2a17c0347764747f76f78a4363f2b50c05d575fe093be0509849152dc3091986"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.412608 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.413527 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" event={"ID":"04067843-8e2d-4a0c-8c68-2e321669b605","Type":"ContainerStarted","Data":"9636afef7618a07564c5d5f8635d885281cb9e363b9fbe29e374e258aff38703"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.413734 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.414915 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" event={"ID":"02e82478-6974-4ae1-b8de-57688876d070","Type":"ContainerStarted","Data":"07cb4251ab0dea36508bd1a14f83f250b2129edb356c2449ffcb8f0942301b58"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.414986 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.416221 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" event={"ID":"56dd103a-afaf-46fa-9cf3-f85418264d29","Type":"ContainerStarted","Data":"8dcbe8abfba40273e19fd01203330a0a711f258a5acfbafdf7d78142a892a451"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.416350 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.417753 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" event={"ID":"69296cc2-890b-439c-8151-9b10963bae3f","Type":"ContainerStarted","Data":"e017b7ca3ab386b17c598783807bbf885c146808e0012d9f48b05e19beaed647"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.417797 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.419042 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" event={"ID":"eb2c9390-f27a-46b0-9249-3e9bdc0c99e3","Type":"ContainerStarted","Data":"66f69d413786b177aa76263d1167eb431532da9101faea88e53e9f431f89ad97"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.419157 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.420312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" event={"ID":"5ea53463-b9a9-4406-b27f-ab1324f4bdcc","Type":"ContainerStarted","Data":"649cd4fdd1b913fcbc3d0ad539efa38f2305b7a9b4c33f7c75464de386b6ae66"} Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.420599 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.437050 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" podStartSLOduration=9.256161872 podStartE2EDuration="49.437032472s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:56.272637204 +0000 UTC m=+990.450829910" lastFinishedPulling="2026-01-11 17:47:36.453507804 +0000 UTC m=+1030.631700510" observedRunningTime="2026-01-11 17:47:44.43251693 +0000 UTC m=+1038.610709636" watchObservedRunningTime="2026-01-11 17:47:44.437032472 +0000 UTC m=+1038.615225178" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.452500 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" podStartSLOduration=12.08527509 podStartE2EDuration="49.452481546s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.121837296 +0000 UTC m=+991.300030002" lastFinishedPulling="2026-01-11 17:47:34.489043752 +0000 UTC m=+1028.667236458" observedRunningTime="2026-01-11 17:47:44.450439521 +0000 UTC m=+1038.628632227" watchObservedRunningTime="2026-01-11 17:47:44.452481546 +0000 UTC 
m=+1038.630674252" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.495009 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" podStartSLOduration=13.282330475 podStartE2EDuration="50.494996816s" podCreationTimestamp="2026-01-11 17:46:54 +0000 UTC" firstStartedPulling="2026-01-11 17:46:56.22480716 +0000 UTC m=+990.402999866" lastFinishedPulling="2026-01-11 17:47:33.437473501 +0000 UTC m=+1027.615666207" observedRunningTime="2026-01-11 17:47:44.493326532 +0000 UTC m=+1038.671519238" watchObservedRunningTime="2026-01-11 17:47:44.494996816 +0000 UTC m=+1038.673189522" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.496450 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" podStartSLOduration=12.132747213 podStartE2EDuration="49.496444775s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.125136354 +0000 UTC m=+991.303329060" lastFinishedPulling="2026-01-11 17:47:34.488833896 +0000 UTC m=+1028.667026622" observedRunningTime="2026-01-11 17:47:44.471563778 +0000 UTC m=+1038.649756494" watchObservedRunningTime="2026-01-11 17:47:44.496444775 +0000 UTC m=+1038.674637481" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.519254 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" podStartSLOduration=12.291228567 podStartE2EDuration="50.519235057s" podCreationTimestamp="2026-01-11 17:46:54 +0000 UTC" firstStartedPulling="2026-01-11 17:46:56.261448724 +0000 UTC m=+990.439641430" lastFinishedPulling="2026-01-11 17:47:34.489455204 +0000 UTC m=+1028.667647920" observedRunningTime="2026-01-11 17:47:44.514745356 +0000 UTC m=+1038.692938052" watchObservedRunningTime="2026-01-11 17:47:44.519235057 +0000 UTC m=+1038.697427763" 
Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.544297 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" podStartSLOduration=10.173573283 podStartE2EDuration="49.544278499s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.097148003 +0000 UTC m=+991.275340709" lastFinishedPulling="2026-01-11 17:47:36.467853219 +0000 UTC m=+1030.646045925" observedRunningTime="2026-01-11 17:47:44.539777427 +0000 UTC m=+1038.717970153" watchObservedRunningTime="2026-01-11 17:47:44.544278499 +0000 UTC m=+1038.722471205" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.562613 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" podStartSLOduration=12.195758572 podStartE2EDuration="49.562592929s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.123708835 +0000 UTC m=+991.301901541" lastFinishedPulling="2026-01-11 17:47:34.490543192 +0000 UTC m=+1028.668735898" observedRunningTime="2026-01-11 17:47:44.559345742 +0000 UTC m=+1038.737538448" watchObservedRunningTime="2026-01-11 17:47:44.562592929 +0000 UTC m=+1038.740785635" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.608439 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" podStartSLOduration=49.60842215 podStartE2EDuration="49.60842215s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:47:44.603250691 +0000 UTC m=+1038.781443397" watchObservedRunningTime="2026-01-11 17:47:44.60842215 +0000 UTC m=+1038.786614856" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.629443 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" podStartSLOduration=10.272757623 podStartE2EDuration="49.629424622s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.096830195 +0000 UTC m=+991.275022901" lastFinishedPulling="2026-01-11 17:47:36.453497194 +0000 UTC m=+1030.631689900" observedRunningTime="2026-01-11 17:47:44.625116147 +0000 UTC m=+1038.803308853" watchObservedRunningTime="2026-01-11 17:47:44.629424622 +0000 UTC m=+1038.807617328" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.667452 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" podStartSLOduration=5.857221127 podStartE2EDuration="49.667431863s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.126735327 +0000 UTC m=+991.304928033" lastFinishedPulling="2026-01-11 17:47:40.936946063 +0000 UTC m=+1035.115138769" observedRunningTime="2026-01-11 17:47:44.662691385 +0000 UTC m=+1038.840884111" watchObservedRunningTime="2026-01-11 17:47:44.667431863 +0000 UTC m=+1038.845624569" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.689816 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" podStartSLOduration=5.905632946 podStartE2EDuration="49.689801913s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.136246372 +0000 UTC m=+991.314439078" lastFinishedPulling="2026-01-11 17:47:40.920415349 +0000 UTC m=+1035.098608045" observedRunningTime="2026-01-11 17:47:44.685978239 +0000 UTC m=+1038.864170945" watchObservedRunningTime="2026-01-11 17:47:44.689801913 +0000 UTC m=+1038.867994619" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.706739 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" podStartSLOduration=10.35381508 podStartE2EDuration="49.706721867s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.099182548 +0000 UTC m=+991.277375254" lastFinishedPulling="2026-01-11 17:47:36.452089335 +0000 UTC m=+1030.630282041" observedRunningTime="2026-01-11 17:47:44.702248576 +0000 UTC m=+1038.880441292" watchObservedRunningTime="2026-01-11 17:47:44.706721867 +0000 UTC m=+1038.884914573" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.745522 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" podStartSLOduration=10.17188153 podStartE2EDuration="50.745505087s" podCreationTimestamp="2026-01-11 17:46:54 +0000 UTC" firstStartedPulling="2026-01-11 17:46:55.879892247 +0000 UTC m=+990.058084953" lastFinishedPulling="2026-01-11 17:47:36.453515804 +0000 UTC m=+1030.631708510" observedRunningTime="2026-01-11 17:47:44.730578076 +0000 UTC m=+1038.908770772" watchObservedRunningTime="2026-01-11 17:47:44.745505087 +0000 UTC m=+1038.923697793" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.749115 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" podStartSLOduration=5.962612813 podStartE2EDuration="49.749108794s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.126830209 +0000 UTC m=+991.305022915" lastFinishedPulling="2026-01-11 17:47:40.91332619 +0000 UTC m=+1035.091518896" observedRunningTime="2026-01-11 17:47:44.74414779 +0000 UTC m=+1038.922340496" watchObservedRunningTime="2026-01-11 17:47:44.749108794 +0000 UTC m=+1038.927301500" Jan 11 17:47:44 crc kubenswrapper[4837]: I0111 17:47:44.763920 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" podStartSLOduration=16.941002099 podStartE2EDuration="50.76390132s" podCreationTimestamp="2026-01-11 17:46:54 +0000 UTC" firstStartedPulling="2026-01-11 17:46:56.034824634 +0000 UTC m=+990.213017340" lastFinishedPulling="2026-01-11 17:47:29.857723855 +0000 UTC m=+1024.035916561" observedRunningTime="2026-01-11 17:47:44.758856945 +0000 UTC m=+1038.937049651" watchObservedRunningTime="2026-01-11 17:47:44.76390132 +0000 UTC m=+1038.942094026" Jan 11 17:47:45 crc kubenswrapper[4837]: I0111 17:47:45.430554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" event={"ID":"c76abbe1-c9d2-414f-8c9a-372f8d5e17bc","Type":"ContainerStarted","Data":"51fecd1109e821ca41d607502d93550d911ecc4e0d4fa07f9634c0ffbc878bf1"} Jan 11 17:47:45 crc kubenswrapper[4837]: I0111 17:47:45.431991 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" Jan 11 17:47:46 crc kubenswrapper[4837]: I0111 17:47:46.445778 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" event={"ID":"b8022f77-44ba-493f-bed8-ad82fa1ca45a","Type":"ContainerStarted","Data":"59f5b0cc6652706cde2cce22ea75fd466ac0945f1442e72387fa286547c74f0c"} Jan 11 17:47:46 crc kubenswrapper[4837]: I0111 17:47:46.447367 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" Jan 11 17:47:46 crc kubenswrapper[4837]: I0111 17:47:46.472905 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" podStartSLOduration=3.695891655 podStartE2EDuration="51.472879248s" 
podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.096742413 +0000 UTC m=+991.274935119" lastFinishedPulling="2026-01-11 17:47:44.873730006 +0000 UTC m=+1039.051922712" observedRunningTime="2026-01-11 17:47:46.470337439 +0000 UTC m=+1040.648530185" watchObservedRunningTime="2026-01-11 17:47:46.472879248 +0000 UTC m=+1040.651071984" Jan 11 17:47:46 crc kubenswrapper[4837]: I0111 17:47:46.486471 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kwp7z" podStartSLOduration=3.7119213330000003 podStartE2EDuration="51.486453212s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.127474037 +0000 UTC m=+991.305666743" lastFinishedPulling="2026-01-11 17:47:44.902005916 +0000 UTC m=+1039.080198622" observedRunningTime="2026-01-11 17:47:46.484557421 +0000 UTC m=+1040.662750157" watchObservedRunningTime="2026-01-11 17:47:46.486453212 +0000 UTC m=+1040.664645928" Jan 11 17:47:48 crc kubenswrapper[4837]: I0111 17:47:48.086642 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5569b88c46-6jqzq" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.493802 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" event={"ID":"b7039fa0-8e22-4369-abcd-baa005429b7b","Type":"ContainerStarted","Data":"7ae46c42637773ce6cdd0cff10055c63651bc6261ceb90306d7f13616b22ce0d"} Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.494734 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.495340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" event={"ID":"61f99042-0859-46d8-9af9-727352a885ee","Type":"ContainerStarted","Data":"941684798ec051ebf5b8ca4d92c2957da1015e7070508936d70dc817d812d72c"} Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.495600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.497221 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" event={"ID":"374c350e-a484-40a8-8563-45eb7f3eafd1","Type":"ContainerStarted","Data":"a0c94c362ae16b5ef6deeaa1f1f146263035ace679e3aec3b02361bdef89c35b"} Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.497349 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.498602 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" event={"ID":"8bdb5237-cb95-4e0c-b52c-85a8a419506b","Type":"ContainerStarted","Data":"8031397bb6e624d542a34b076c017e5784fe813b035c05333a9b91987c8cabcb"} Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.498824 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.499962 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" event={"ID":"c2312108-ddf5-4939-acc1-727557936791","Type":"ContainerStarted","Data":"0047a68dbc39349e2497d72e6d036f8e54eac066a70dfc7a85d0754a166d0318"} Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.500097 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.501976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" event={"ID":"ddce549f-ba1d-483d-b50b-4011c826bbff","Type":"ContainerStarted","Data":"67a62bebab503a1ad66abcf37dd4e8d7bb80630c0178683bc37b6714b34881e9"} Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.502177 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.513505 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" podStartSLOduration=3.180491027 podStartE2EDuration="55.513488258s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.082823529 +0000 UTC m=+991.261016235" lastFinishedPulling="2026-01-11 17:47:49.41582076 +0000 UTC m=+1043.594013466" observedRunningTime="2026-01-11 17:47:50.512214793 +0000 UTC m=+1044.690407509" watchObservedRunningTime="2026-01-11 17:47:50.513488258 +0000 UTC m=+1044.691680964" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.543692 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" podStartSLOduration=47.402043307 podStartE2EDuration="55.543658346s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:47:41.272505085 +0000 UTC m=+1035.450697791" lastFinishedPulling="2026-01-11 17:47:49.414120124 +0000 UTC m=+1043.592312830" observedRunningTime="2026-01-11 17:47:50.536975417 +0000 UTC m=+1044.715168123" watchObservedRunningTime="2026-01-11 
17:47:50.543658346 +0000 UTC m=+1044.721851052" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.571539 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" podStartSLOduration=3.257131543 podStartE2EDuration="55.571522764s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.099355423 +0000 UTC m=+991.277548129" lastFinishedPulling="2026-01-11 17:47:49.413746634 +0000 UTC m=+1043.591939350" observedRunningTime="2026-01-11 17:47:50.552195895 +0000 UTC m=+1044.730388611" watchObservedRunningTime="2026-01-11 17:47:50.571522764 +0000 UTC m=+1044.749715470" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.571966 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" podStartSLOduration=2.720689051 podStartE2EDuration="55.571964006s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:56.562443728 +0000 UTC m=+990.740636424" lastFinishedPulling="2026-01-11 17:47:49.413718673 +0000 UTC m=+1043.591911379" observedRunningTime="2026-01-11 17:47:50.571247547 +0000 UTC m=+1044.749440253" watchObservedRunningTime="2026-01-11 17:47:50.571964006 +0000 UTC m=+1044.750156712" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.593490 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" podStartSLOduration=3.276857262 podStartE2EDuration="55.593475633s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:46:57.097363229 +0000 UTC m=+991.275555935" lastFinishedPulling="2026-01-11 17:47:49.4139816 +0000 UTC m=+1043.592174306" observedRunningTime="2026-01-11 17:47:50.591278434 +0000 UTC m=+1044.769471140" watchObservedRunningTime="2026-01-11 17:47:50.593475633 
+0000 UTC m=+1044.771668339" Jan 11 17:47:50 crc kubenswrapper[4837]: I0111 17:47:50.611835 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" podStartSLOduration=47.560387166 podStartE2EDuration="55.611811995s" podCreationTimestamp="2026-01-11 17:46:55 +0000 UTC" firstStartedPulling="2026-01-11 17:47:41.362543941 +0000 UTC m=+1035.540736647" lastFinishedPulling="2026-01-11 17:47:49.41396878 +0000 UTC m=+1043.592161476" observedRunningTime="2026-01-11 17:47:50.607835498 +0000 UTC m=+1044.786028204" watchObservedRunningTime="2026-01-11 17:47:50.611811995 +0000 UTC m=+1044.790004701" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.273044 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-bv24m" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.301003 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-6fhn7" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.306115 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-26d4f" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.413976 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-9ptmm" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.457116 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-75fgx" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.481300 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cz9h6" Jan 11 17:47:55 crc 
kubenswrapper[4837]: I0111 17:47:55.644346 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-87jjr" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.661050 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-jgrh6" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.692520 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-665ds" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.714980 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-7k4z8" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.719926 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-x648q" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.736309 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vm5gr" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.787657 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-2lvvk" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.825785 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-4txs5" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.871995 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-wf222" Jan 11 17:47:55 crc kubenswrapper[4837]: I0111 17:47:55.904274 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-hspqd" Jan 11 17:47:56 crc kubenswrapper[4837]: I0111 17:47:56.033093 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-tkj2c" Jan 11 17:47:56 crc kubenswrapper[4837]: I0111 17:47:56.179297 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-shqx6" Jan 11 17:47:56 crc kubenswrapper[4837]: I0111 17:47:56.223162 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jsqjz" Jan 11 17:47:57 crc kubenswrapper[4837]: I0111 17:47:57.316583 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-4hndt" Jan 11 17:47:57 crc kubenswrapper[4837]: I0111 17:47:57.708552 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.539734 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vrgl5"] Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.541490 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.545798 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.545863 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.545936 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.545988 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cmtmq" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.556212 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vrgl5"] Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.621728 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b65961-9462-460e-9dff-90141e4f764c-config\") pod \"dnsmasq-dns-675f4bcbfc-vrgl5\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.621781 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85g4h\" (UniqueName: \"kubernetes.io/projected/f7b65961-9462-460e-9dff-90141e4f764c-kube-api-access-85g4h\") pod \"dnsmasq-dns-675f4bcbfc-vrgl5\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.641033 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-brdp5"] Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.642071 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.645310 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.722317 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-brdp5"] Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.723223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85g4h\" (UniqueName: \"kubernetes.io/projected/f7b65961-9462-460e-9dff-90141e4f764c-kube-api-access-85g4h\") pod \"dnsmasq-dns-675f4bcbfc-vrgl5\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.723256 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.723557 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzxk\" (UniqueName: \"kubernetes.io/projected/bd4d43a7-b705-4282-8493-8c1b3bdd4015-kube-api-access-ktzxk\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.723628 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-config\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 
17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.723689 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b65961-9462-460e-9dff-90141e4f764c-config\") pod \"dnsmasq-dns-675f4bcbfc-vrgl5\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.724466 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b65961-9462-460e-9dff-90141e4f764c-config\") pod \"dnsmasq-dns-675f4bcbfc-vrgl5\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.745659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85g4h\" (UniqueName: \"kubernetes.io/projected/f7b65961-9462-460e-9dff-90141e4f764c-kube-api-access-85g4h\") pod \"dnsmasq-dns-675f4bcbfc-vrgl5\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.824435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.824703 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzxk\" (UniqueName: \"kubernetes.io/projected/bd4d43a7-b705-4282-8493-8c1b3bdd4015-kube-api-access-ktzxk\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.824751 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-config\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.825386 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.825445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-config\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.868019 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.869041 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzxk\" (UniqueName: \"kubernetes.io/projected/bd4d43a7-b705-4282-8493-8c1b3bdd4015-kube-api-access-ktzxk\") pod \"dnsmasq-dns-78dd6ddcc-brdp5\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:15 crc kubenswrapper[4837]: I0111 17:48:15.957583 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:16 crc kubenswrapper[4837]: I0111 17:48:16.299610 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vrgl5"] Jan 11 17:48:16 crc kubenswrapper[4837]: I0111 17:48:16.428565 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-brdp5"] Jan 11 17:48:16 crc kubenswrapper[4837]: W0111 17:48:16.433843 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4d43a7_b705_4282_8493_8c1b3bdd4015.slice/crio-89578bb91fcf988d19c3df4debc0647cfdc22a7e4dc1e876f1e3ddab59bb8717 WatchSource:0}: Error finding container 89578bb91fcf988d19c3df4debc0647cfdc22a7e4dc1e876f1e3ddab59bb8717: Status 404 returned error can't find the container with id 89578bb91fcf988d19c3df4debc0647cfdc22a7e4dc1e876f1e3ddab59bb8717 Jan 11 17:48:17 crc kubenswrapper[4837]: I0111 17:48:17.065174 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" event={"ID":"f7b65961-9462-460e-9dff-90141e4f764c","Type":"ContainerStarted","Data":"1a88f536e90e5690338040a0d4b8064f6c4b7ebaeb1c916e54e992dac62258d4"} Jan 11 17:48:17 crc kubenswrapper[4837]: I0111 17:48:17.066743 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" event={"ID":"bd4d43a7-b705-4282-8493-8c1b3bdd4015","Type":"ContainerStarted","Data":"89578bb91fcf988d19c3df4debc0647cfdc22a7e4dc1e876f1e3ddab59bb8717"} Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.515144 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vrgl5"] Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.544775 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9d6t8"] Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.545806 4837 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.553882 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9d6t8"] Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.665328 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plsmv\" (UniqueName: \"kubernetes.io/projected/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-kube-api-access-plsmv\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.665380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.665425 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-config\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.766962 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plsmv\" (UniqueName: \"kubernetes.io/projected/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-kube-api-access-plsmv\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.767022 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.767067 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-config\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.767978 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-config\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.768080 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.777856 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-brdp5"] Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.799661 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plsmv\" (UniqueName: \"kubernetes.io/projected/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-kube-api-access-plsmv\") pod \"dnsmasq-dns-666b6646f7-9d6t8\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.800581 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mm7j2"] Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.801790 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.819095 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mm7j2"] Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.864088 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.969818 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.969899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgqz\" (UniqueName: \"kubernetes.io/projected/238f9a29-a4e5-4a96-a90d-17b17a1200d2-kube-api-access-hmgqz\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:18 crc kubenswrapper[4837]: I0111 17:48:18.969929 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-config\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.071265 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.071362 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgqz\" (UniqueName: \"kubernetes.io/projected/238f9a29-a4e5-4a96-a90d-17b17a1200d2-kube-api-access-hmgqz\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.071383 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-config\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.072208 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-config\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.072212 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.091487 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgqz\" (UniqueName: \"kubernetes.io/projected/238f9a29-a4e5-4a96-a90d-17b17a1200d2-kube-api-access-hmgqz\") pod 
\"dnsmasq-dns-57d769cc4f-mm7j2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.126803 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.295504 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9d6t8"] Jan 11 17:48:19 crc kubenswrapper[4837]: W0111 17:48:19.297723 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc92d4f03_aa9b_4056_8e9b_2ef99a23d6ca.slice/crio-b16f3d51c6d06d04f6e28cc20ba7d997f514ad577976a8ebe1b7b43e67e72849 WatchSource:0}: Error finding container b16f3d51c6d06d04f6e28cc20ba7d997f514ad577976a8ebe1b7b43e67e72849: Status 404 returned error can't find the container with id b16f3d51c6d06d04f6e28cc20ba7d997f514ad577976a8ebe1b7b43e67e72849 Jan 11 17:48:19 crc kubenswrapper[4837]: I0111 17:48:19.570059 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mm7j2"] Jan 11 17:48:19 crc kubenswrapper[4837]: W0111 17:48:19.578364 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod238f9a29_a4e5_4a96_a90d_17b17a1200d2.slice/crio-e9940f25dacab204423c946fb342bdfa2eb047261c8c4a53d9c7676956a6de15 WatchSource:0}: Error finding container e9940f25dacab204423c946fb342bdfa2eb047261c8c4a53d9c7676956a6de15: Status 404 returned error can't find the container with id e9940f25dacab204423c946fb342bdfa2eb047261c8c4a53d9c7676956a6de15 Jan 11 17:48:20 crc kubenswrapper[4837]: I0111 17:48:20.092930 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" 
event={"ID":"238f9a29-a4e5-4a96-a90d-17b17a1200d2","Type":"ContainerStarted","Data":"e9940f25dacab204423c946fb342bdfa2eb047261c8c4a53d9c7676956a6de15"} Jan 11 17:48:20 crc kubenswrapper[4837]: I0111 17:48:20.094246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" event={"ID":"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca","Type":"ContainerStarted","Data":"b16f3d51c6d06d04f6e28cc20ba7d997f514ad577976a8ebe1b7b43e67e72849"} Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.195377 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.201889 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.205541 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.205847 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.205970 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.206119 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8xf8t" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.206243 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.206346 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.209343 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:48:22 crc 
kubenswrapper[4837]: I0111 17:48:22.210519 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.211917 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.212147 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.212262 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.213025 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xhdt5" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.213308 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.213415 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.214269 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.214391 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.222054 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.223787 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.232694 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.234883 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.234917 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ckn46" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.234978 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.235469 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.238093 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.239068 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.245129 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325399 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325441 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmxwv\" (UniqueName: 
\"kubernetes.io/projected/09134535-27db-4787-89a5-c01f72ffa182-kube-api-access-nmxwv\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89ca88bf-2462-4fce-8a85-8dc04655b21c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325484 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325503 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325576 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325687 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/f21b505a-45c3-4f7e-b323-204d384185b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325713 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325735 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/09134535-27db-4787-89a5-c01f72ffa182-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325789 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325808 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvv9\" (UniqueName: 
\"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-kube-api-access-5lvv9\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325827 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-config-data-default\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325842 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325885 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.325966 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-operator-scripts\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326041 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89ca88bf-2462-4fce-8a85-8dc04655b21c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326065 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326084 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326106 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-kolla-config\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326119 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09134535-27db-4787-89a5-c01f72ffa182-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326150 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f21b505a-45c3-4f7e-b323-204d384185b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326231 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz8zc\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-kube-api-access-kz8zc\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-config-data\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09134535-27db-4787-89a5-c01f72ffa182-config-data-generated\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.326370 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f21b505a-45c3-4f7e-b323-204d384185b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434166 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/09134535-27db-4787-89a5-c01f72ffa182-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434217 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434254 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434276 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvv9\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-kube-api-access-5lvv9\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434297 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434313 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434347 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434369 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-operator-scripts\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434388 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89ca88bf-2462-4fce-8a85-8dc04655b21c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434419 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434438 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434453 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09134535-27db-4787-89a5-c01f72ffa182-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-kolla-config\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434495 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434522 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f21b505a-45c3-4f7e-b323-204d384185b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434582 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz8zc\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-kube-api-access-kz8zc\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434626 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434645 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-config-data\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434687 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09134535-27db-4787-89a5-c01f72ffa182-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434707 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434737 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmxwv\" (UniqueName: \"kubernetes.io/projected/09134535-27db-4787-89a5-c01f72ffa182-kube-api-access-nmxwv\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89ca88bf-2462-4fce-8a85-8dc04655b21c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434796 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.434810 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.435283 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.437844 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-config-data\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.438098 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.438571 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.439630 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.439742 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.439785 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09134535-27db-4787-89a5-c01f72ffa182-config-data-generated\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.440550 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-config-data-default\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.440732 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.440817 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.443015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-kolla-config\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.443398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89ca88bf-2462-4fce-8a85-8dc04655b21c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.443613 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.443906 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.443924 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.444314 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.444496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.444588 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.447873 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f21b505a-45c3-4f7e-b323-204d384185b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.448004 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.448447 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09134535-27db-4787-89a5-c01f72ffa182-operator-scripts\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.460619 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.463380 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89ca88bf-2462-4fce-8a85-8dc04655b21c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.463632 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f21b505a-45c3-4f7e-b323-204d384185b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.464861 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/09134535-27db-4787-89a5-c01f72ffa182-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " 
pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.465123 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmxwv\" (UniqueName: \"kubernetes.io/projected/09134535-27db-4787-89a5-c01f72ffa182-kube-api-access-nmxwv\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.465363 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09134535-27db-4787-89a5-c01f72ffa182-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.465537 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvv9\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-kube-api-access-5lvv9\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.466121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.485885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.489453 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kz8zc\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-kube-api-access-kz8zc\") pod \"rabbitmq-server-0\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.506097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.532037 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"09134535-27db-4787-89a5-c01f72ffa182\") " pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.541974 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.563031 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.579882 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.585556 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.586589 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.592324 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-f57xf" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.594543 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.603939 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.615579 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.636859 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.739705 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.740063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.740105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.740123 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bafaf023-917f-44a9-807e-b6a0f6a55e77-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.740264 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafaf023-917f-44a9-807e-b6a0f6a55e77-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.740378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.740397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkvwt\" (UniqueName: \"kubernetes.io/projected/bafaf023-917f-44a9-807e-b6a0f6a55e77-kube-api-access-hkvwt\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.740423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bafaf023-917f-44a9-807e-b6a0f6a55e77-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841689 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bafaf023-917f-44a9-807e-b6a0f6a55e77-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafaf023-917f-44a9-807e-b6a0f6a55e77-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841810 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkvwt\" (UniqueName: 
\"kubernetes.io/projected/bafaf023-917f-44a9-807e-b6a0f6a55e77-kube-api-access-hkvwt\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841842 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bafaf023-917f-44a9-807e-b6a0f6a55e77-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841866 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.841895 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.842508 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.843008 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bafaf023-917f-44a9-807e-b6a0f6a55e77-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.843207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.843630 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.844578 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafaf023-917f-44a9-807e-b6a0f6a55e77-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.849268 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bafaf023-917f-44a9-807e-b6a0f6a55e77-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.855434 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafaf023-917f-44a9-807e-b6a0f6a55e77-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " 
pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.874986 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.875958 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.877974 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkvwt\" (UniqueName: \"kubernetes.io/projected/bafaf023-917f-44a9-807e-b6a0f6a55e77-kube-api-access-hkvwt\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.880779 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.880957 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.881061 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8xzh8" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.891395 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.896969 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bafaf023-917f-44a9-807e-b6a0f6a55e77\") " pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.953367 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.953855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-config-data\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.953896 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.953920 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.953952 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-kolla-config\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:22 crc kubenswrapper[4837]: I0111 17:48:22.953971 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r596\" (UniqueName: \"kubernetes.io/projected/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-kube-api-access-5r596\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 
17:48:23.071687 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r596\" (UniqueName: \"kubernetes.io/projected/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-kube-api-access-5r596\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.071802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-config-data\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.071841 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.071869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.071901 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-kolla-config\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.072719 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-kolla-config\") pod 
\"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.073314 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-config-data\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.092033 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.092487 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.098618 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r596\" (UniqueName: \"kubernetes.io/projected/d1aa5cf3-303a-4a5b-8802-fe264fa090d6-kube-api-access-5r596\") pod \"memcached-0\" (UID: \"d1aa5cf3-303a-4a5b-8802-fe264fa090d6\") " pod="openstack/memcached-0" Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.220773 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0"
Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.433492 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 11 17:48:23 crc kubenswrapper[4837]: W0111 17:48:23.439862 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21b505a_45c3_4f7e_b323_204d384185b9.slice/crio-593ebf5705e544460c5f953044732d953c8ad2664adbfad585706c2209e16132 WatchSource:0}: Error finding container 593ebf5705e544460c5f953044732d953c8ad2664adbfad585706c2209e16132: Status 404 returned error can't find the container with id 593ebf5705e544460c5f953044732d953c8ad2664adbfad585706c2209e16132
Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.558165 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.571108 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 11 17:48:23 crc kubenswrapper[4837]: W0111 17:48:23.592509 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89ca88bf_2462_4fce_8a85_8dc04655b21c.slice/crio-b8506eccbe2ac34ea0c156f9f059de610ca78f68e43af4f6163f0631a5b79a96 WatchSource:0}: Error finding container b8506eccbe2ac34ea0c156f9f059de610ca78f68e43af4f6163f0631a5b79a96: Status 404 returned error can't find the container with id b8506eccbe2ac34ea0c156f9f059de610ca78f68e43af4f6163f0631a5b79a96
Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.593751 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 11 17:48:23 crc kubenswrapper[4837]: I0111 17:48:23.888514 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 11 17:48:23 crc kubenswrapper[4837]: W0111 17:48:23.907355 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1aa5cf3_303a_4a5b_8802_fe264fa090d6.slice/crio-d688dc13ae15e848636d1c84122664c3d0af4edf9e2888177d679dd2bb4a2c4c WatchSource:0}: Error finding container d688dc13ae15e848636d1c84122664c3d0af4edf9e2888177d679dd2bb4a2c4c: Status 404 returned error can't find the container with id d688dc13ae15e848636d1c84122664c3d0af4edf9e2888177d679dd2bb4a2c4c
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.173423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f21b505a-45c3-4f7e-b323-204d384185b9","Type":"ContainerStarted","Data":"593ebf5705e544460c5f953044732d953c8ad2664adbfad585706c2209e16132"}
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.174894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bafaf023-917f-44a9-807e-b6a0f6a55e77","Type":"ContainerStarted","Data":"f13ba99fefba6ab17c434538360d9d232659534645fae801fd61906d8f1571a6"}
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.178370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d1aa5cf3-303a-4a5b-8802-fe264fa090d6","Type":"ContainerStarted","Data":"d688dc13ae15e848636d1c84122664c3d0af4edf9e2888177d679dd2bb4a2c4c"}
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.182086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89ca88bf-2462-4fce-8a85-8dc04655b21c","Type":"ContainerStarted","Data":"b8506eccbe2ac34ea0c156f9f059de610ca78f68e43af4f6163f0631a5b79a96"}
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.187092 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"09134535-27db-4787-89a5-c01f72ffa182","Type":"ContainerStarted","Data":"1aa96cc311b45268c6dcd4499fef86a838015be2dbaefa49aef32eb443d2a06c"}
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.836756 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.838327 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.852112 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-x44cs"
Jan 11 17:48:24 crc kubenswrapper[4837]: I0111 17:48:24.863527 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 11 17:48:25 crc kubenswrapper[4837]: I0111 17:48:25.023578 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-424k8\" (UniqueName: \"kubernetes.io/projected/cad8e11f-3ef0-4043-a49e-308c103a973f-kube-api-access-424k8\") pod \"kube-state-metrics-0\" (UID: \"cad8e11f-3ef0-4043-a49e-308c103a973f\") " pod="openstack/kube-state-metrics-0"
Jan 11 17:48:25 crc kubenswrapper[4837]: I0111 17:48:25.124633 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-424k8\" (UniqueName: \"kubernetes.io/projected/cad8e11f-3ef0-4043-a49e-308c103a973f-kube-api-access-424k8\") pod \"kube-state-metrics-0\" (UID: \"cad8e11f-3ef0-4043-a49e-308c103a973f\") " pod="openstack/kube-state-metrics-0"
Jan 11 17:48:25 crc kubenswrapper[4837]: I0111 17:48:25.145600 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-424k8\" (UniqueName: \"kubernetes.io/projected/cad8e11f-3ef0-4043-a49e-308c103a973f-kube-api-access-424k8\") pod \"kube-state-metrics-0\" (UID: \"cad8e11f-3ef0-4043-a49e-308c103a973f\") " pod="openstack/kube-state-metrics-0"
Jan 11 17:48:25 crc kubenswrapper[4837]: I0111 17:48:25.194056 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 11 17:48:25 crc kubenswrapper[4837]: I0111 17:48:25.772552 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 11 17:48:25 crc kubenswrapper[4837]: W0111 17:48:25.806924 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcad8e11f_3ef0_4043_a49e_308c103a973f.slice/crio-a21b0b133eabc2ef3669d0ea1e12c8eee8851315f3e5bf7d1bb79bce3fcaeda2 WatchSource:0}: Error finding container a21b0b133eabc2ef3669d0ea1e12c8eee8851315f3e5bf7d1bb79bce3fcaeda2: Status 404 returned error can't find the container with id a21b0b133eabc2ef3669d0ea1e12c8eee8851315f3e5bf7d1bb79bce3fcaeda2
Jan 11 17:48:26 crc kubenswrapper[4837]: I0111 17:48:26.241160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cad8e11f-3ef0-4043-a49e-308c103a973f","Type":"ContainerStarted","Data":"a21b0b133eabc2ef3669d0ea1e12c8eee8851315f3e5bf7d1bb79bce3fcaeda2"}
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.883562 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zfjdc"]
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.885205 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.887388 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-n9jff"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.887818 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.888091 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.899390 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc"]
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.939885 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-22bpd"]
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.942118 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.945821 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-22bpd"]
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993401 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f28f51-1965-4fdd-bcb8-c261644249d5-combined-ca-bundle\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-log\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-lib\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993581 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-run\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993610 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-scripts\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993630 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-run-ovn\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91f28f51-1965-4fdd-bcb8-c261644249d5-scripts\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993805 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f28f51-1965-4fdd-bcb8-c261644249d5-ovn-controller-tls-certs\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993865 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-log-ovn\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993887 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64m9\" (UniqueName: \"kubernetes.io/projected/91f28f51-1965-4fdd-bcb8-c261644249d5-kube-api-access-t64m9\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-etc-ovs\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.993989 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dwk\" (UniqueName: \"kubernetes.io/projected/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-kube-api-access-d7dwk\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:28 crc kubenswrapper[4837]: I0111 17:48:28.994024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-run\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.095921 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f28f51-1965-4fdd-bcb8-c261644249d5-combined-ca-bundle\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.095973 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-log\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.096012 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-lib\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.096050 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-run\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-scripts\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097115 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-run-ovn\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097029 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-log\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91f28f51-1965-4fdd-bcb8-c261644249d5-scripts\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097168 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f28f51-1965-4fdd-bcb8-c261644249d5-ovn-controller-tls-certs\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097194 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-log-ovn\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097213 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t64m9\" (UniqueName: \"kubernetes.io/projected/91f28f51-1965-4fdd-bcb8-c261644249d5-kube-api-access-t64m9\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097238 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-etc-ovs\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097273 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dwk\" (UniqueName: \"kubernetes.io/projected/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-kube-api-access-d7dwk\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097295 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-run\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097994 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-etc-ovs\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.098073 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-log-ovn\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.097109 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-lib\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.099031 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91f28f51-1965-4fdd-bcb8-c261644249d5-scripts\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.099998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-var-run\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.100180 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-scripts\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.100854 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-run\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.100859 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91f28f51-1965-4fdd-bcb8-c261644249d5-var-run-ovn\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.102846 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f28f51-1965-4fdd-bcb8-c261644249d5-combined-ca-bundle\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.112235 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f28f51-1965-4fdd-bcb8-c261644249d5-ovn-controller-tls-certs\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.142506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64m9\" (UniqueName: \"kubernetes.io/projected/91f28f51-1965-4fdd-bcb8-c261644249d5-kube-api-access-t64m9\") pod \"ovn-controller-zfjdc\" (UID: \"91f28f51-1965-4fdd-bcb8-c261644249d5\") " pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.147176 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dwk\" (UniqueName: \"kubernetes.io/projected/0e7e2f2f-8ba4-4156-a06f-abe8b8c39477-kube-api-access-d7dwk\") pod \"ovn-controller-ovs-22bpd\" (UID: \"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477\") " pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.222158 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zfjdc"
Jan 11 17:48:29 crc kubenswrapper[4837]: I0111 17:48:29.266625 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-22bpd"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.705746 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.707050 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.709592 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-sgbpn"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.709834 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.710526 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.710548 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.710832 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.714828 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821463 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e69a588-3047-499a-b5cb-000fdcc7762a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821525 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf8tb\" (UniqueName: \"kubernetes.io/projected/5e69a588-3047-499a-b5cb-000fdcc7762a-kube-api-access-rf8tb\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821566 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e69a588-3047-499a-b5cb-000fdcc7762a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e69a588-3047-499a-b5cb-000fdcc7762a-config\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821649 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821772 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.821805 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923581 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e69a588-3047-499a-b5cb-000fdcc7762a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf8tb\" (UniqueName: \"kubernetes.io/projected/5e69a588-3047-499a-b5cb-000fdcc7762a-kube-api-access-rf8tb\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923730 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e69a588-3047-499a-b5cb-000fdcc7762a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e69a588-3047-499a-b5cb-000fdcc7762a-config\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923762 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.923800 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.924305 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e69a588-3047-499a-b5cb-000fdcc7762a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.924717 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.925222 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e69a588-3047-499a-b5cb-000fdcc7762a-config\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.926235 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e69a588-3047-499a-b5cb-000fdcc7762a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.928819 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.930729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.930884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e69a588-3047-499a-b5cb-000fdcc7762a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.943646 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:30 crc kubenswrapper[4837]: I0111 17:48:30.947724 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf8tb\" (UniqueName: \"kubernetes.io/projected/5e69a588-3047-499a-b5cb-000fdcc7762a-kube-api-access-rf8tb\") pod \"ovsdbserver-nb-0\" (UID: \"5e69a588-3047-499a-b5cb-000fdcc7762a\") " pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.030654 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.590491 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.596088 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.598630 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.606395 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.606472 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.607382 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-sl9r8"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.607485 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638089 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638160 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bc3c0fec-5357-46ca-929a-527f01e1eb3d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638238 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkc5p\" (UniqueName: \"kubernetes.io/projected/bc3c0fec-5357-46ca-929a-527f01e1eb3d-kube-api-access-mkc5p\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638327 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc3c0fec-5357-46ca-929a-527f01e1eb3d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638361 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638390 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3c0fec-5357-46ca-929a-527f01e1eb3d-config\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.638437 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.739626 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkc5p\" (UniqueName: \"kubernetes.io/projected/bc3c0fec-5357-46ca-929a-527f01e1eb3d-kube-api-access-mkc5p\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.739782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc3c0fec-5357-46ca-929a-527f01e1eb3d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.739828 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.739860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3c0fec-5357-46ca-929a-527f01e1eb3d-config\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.739889 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.739923 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.739983 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.740407 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0"
Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.740534 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName:
\"kubernetes.io/empty-dir/bc3c0fec-5357-46ca-929a-527f01e1eb3d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.740952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc3c0fec-5357-46ca-929a-527f01e1eb3d-config\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.740990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bc3c0fec-5357-46ca-929a-527f01e1eb3d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.741258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc3c0fec-5357-46ca-929a-527f01e1eb3d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.745545 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.745745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc 
kubenswrapper[4837]: I0111 17:48:31.751176 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc3c0fec-5357-46ca-929a-527f01e1eb3d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.758331 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkc5p\" (UniqueName: \"kubernetes.io/projected/bc3c0fec-5357-46ca-929a-527f01e1eb3d-kube-api-access-mkc5p\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.779010 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bc3c0fec-5357-46ca-929a-527f01e1eb3d\") " pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:31 crc kubenswrapper[4837]: I0111 17:48:31.916406 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 11 17:48:51 crc kubenswrapper[4837]: E0111 17:48:51.087958 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Jan 11 17:48:51 crc kubenswrapper[4837]: E0111 17:48:51.088640 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:ncbh554h66h95h5d6h558h5b6h5d8h68h57dh5ffh8dhb8h7dh5bbh55bh79h669h695hcch5dch6hchbfh666h66chf9hb4h554h5fh595h678q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountP
ropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5r596,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(d1aa5cf3-303a-4a5b-8802-fe264fa090d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:48:51 crc kubenswrapper[4837]: E0111 17:48:51.089776 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/memcached-0" podUID="d1aa5cf3-303a-4a5b-8802-fe264fa090d6" Jan 11 17:48:51 crc kubenswrapper[4837]: E0111 17:48:51.474434 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="d1aa5cf3-303a-4a5b-8802-fe264fa090d6" Jan 11 17:48:52 crc kubenswrapper[4837]: E0111 17:48:52.544540 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 11 17:48:52 crc kubenswrapper[4837]: E0111 17:48:52.544726 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lvv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(f21b505a-45c3-4f7e-b323-204d384185b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:48:52 crc 
kubenswrapper[4837]: E0111 17:48:52.545904 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" Jan 11 17:48:53 crc kubenswrapper[4837]: E0111 17:48:53.492524 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" Jan 11 17:48:56 crc kubenswrapper[4837]: E0111 17:48:56.468367 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 11 17:48:56 crc kubenswrapper[4837]: E0111 17:48:56.470045 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kz8zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(89ca88bf-2462-4fce-8a85-8dc04655b21c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:48:56 crc 
kubenswrapper[4837]: E0111 17:48:56.471509 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" Jan 11 17:48:56 crc kubenswrapper[4837]: E0111 17:48:56.514251 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.367009 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.367500 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmgqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-mm7j2_openstack(238f9a29-a4e5-4a96-a90d-17b17a1200d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.368688 4837 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" podUID="238f9a29-a4e5-4a96-a90d-17b17a1200d2" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.373251 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.373391 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ktzxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-brdp5_openstack(bd4d43a7-b705-4282-8493-8c1b3bdd4015): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.375135 4837 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" podUID="bd4d43a7-b705-4282-8493-8c1b3bdd4015" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.376873 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.377064 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plsmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-9d6t8_openstack(c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.378214 4837 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" podUID="c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.398006 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.398190 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85g4h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-vrgl5_openstack(f7b65961-9462-460e-9dff-90141e4f764c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.399741 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" podUID="f7b65961-9462-460e-9dff-90141e4f764c" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.523123 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" podUID="238f9a29-a4e5-4a96-a90d-17b17a1200d2" Jan 11 17:48:57 crc kubenswrapper[4837]: E0111 17:48:57.524462 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" podUID="c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca" Jan 11 17:48:57 crc kubenswrapper[4837]: I0111 17:48:57.775411 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc"] Jan 11 17:48:57 crc kubenswrapper[4837]: I0111 17:48:57.909027 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:57 crc kubenswrapper[4837]: I0111 17:48:57.940059 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-22bpd"] Jan 11 17:48:57 crc kubenswrapper[4837]: W0111 17:48:57.943590 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f28f51_1965_4fdd_bcb8_c261644249d5.slice/crio-e70a9e6d5b4d4d3870d2b1d1b2a90d721a121e3119f0b4fb015a489d186a2748 WatchSource:0}: Error finding container e70a9e6d5b4d4d3870d2b1d1b2a90d721a121e3119f0b4fb015a489d186a2748: Status 404 returned error can't find the container with id e70a9e6d5b4d4d3870d2b1d1b2a90d721a121e3119f0b4fb015a489d186a2748 Jan 11 17:48:57 crc kubenswrapper[4837]: W0111 17:48:57.950962 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7e2f2f_8ba4_4156_a06f_abe8b8c39477.slice/crio-950685ced33eb848c3326703772f058213cf9b6093ee2d030659aae0a917ec29 WatchSource:0}: Error finding container 950685ced33eb848c3326703772f058213cf9b6093ee2d030659aae0a917ec29: Status 404 returned error can't find the container with id 950685ced33eb848c3326703772f058213cf9b6093ee2d030659aae0a917ec29 Jan 11 17:48:57 crc kubenswrapper[4837]: I0111 17:48:57.977369 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.024738 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-config\") pod \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.024924 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktzxk\" (UniqueName: \"kubernetes.io/projected/bd4d43a7-b705-4282-8493-8c1b3bdd4015-kube-api-access-ktzxk\") pod \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.024949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-dns-svc\") pod \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\" (UID: \"bd4d43a7-b705-4282-8493-8c1b3bdd4015\") " Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.025378 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-config" (OuterVolumeSpecName: "config") pod "bd4d43a7-b705-4282-8493-8c1b3bdd4015" (UID: "bd4d43a7-b705-4282-8493-8c1b3bdd4015"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.025483 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd4d43a7-b705-4282-8493-8c1b3bdd4015" (UID: "bd4d43a7-b705-4282-8493-8c1b3bdd4015"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.028045 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4d43a7-b705-4282-8493-8c1b3bdd4015-kube-api-access-ktzxk" (OuterVolumeSpecName: "kube-api-access-ktzxk") pod "bd4d43a7-b705-4282-8493-8c1b3bdd4015" (UID: "bd4d43a7-b705-4282-8493-8c1b3bdd4015"). InnerVolumeSpecName "kube-api-access-ktzxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.126639 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b65961-9462-460e-9dff-90141e4f764c-config\") pod \"f7b65961-9462-460e-9dff-90141e4f764c\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.126957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85g4h\" (UniqueName: \"kubernetes.io/projected/f7b65961-9462-460e-9dff-90141e4f764c-kube-api-access-85g4h\") pod \"f7b65961-9462-460e-9dff-90141e4f764c\" (UID: \"f7b65961-9462-460e-9dff-90141e4f764c\") " Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.127382 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b65961-9462-460e-9dff-90141e4f764c-config" (OuterVolumeSpecName: "config") pod "f7b65961-9462-460e-9dff-90141e4f764c" (UID: "f7b65961-9462-460e-9dff-90141e4f764c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.127408 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktzxk\" (UniqueName: \"kubernetes.io/projected/bd4d43a7-b705-4282-8493-8c1b3bdd4015-kube-api-access-ktzxk\") on node \"crc\" DevicePath \"\"" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.127546 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.127630 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4d43a7-b705-4282-8493-8c1b3bdd4015-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.130371 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b65961-9462-460e-9dff-90141e4f764c-kube-api-access-85g4h" (OuterVolumeSpecName: "kube-api-access-85g4h") pod "f7b65961-9462-460e-9dff-90141e4f764c" (UID: "f7b65961-9462-460e-9dff-90141e4f764c"). InnerVolumeSpecName "kube-api-access-85g4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.229429 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b65961-9462-460e-9dff-90141e4f764c-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.229465 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85g4h\" (UniqueName: \"kubernetes.io/projected/f7b65961-9462-460e-9dff-90141e4f764c-kube-api-access-85g4h\") on node \"crc\" DevicePath \"\"" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.229537 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.314587 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.528509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" event={"ID":"f7b65961-9462-460e-9dff-90141e4f764c","Type":"ContainerDied","Data":"1a88f536e90e5690338040a0d4b8064f6c4b7ebaeb1c916e54e992dac62258d4"} Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.528597 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vrgl5" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.531710 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc" event={"ID":"91f28f51-1965-4fdd-bcb8-c261644249d5","Type":"ContainerStarted","Data":"e70a9e6d5b4d4d3870d2b1d1b2a90d721a121e3119f0b4fb015a489d186a2748"} Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.533116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-22bpd" event={"ID":"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477","Type":"ContainerStarted","Data":"950685ced33eb848c3326703772f058213cf9b6093ee2d030659aae0a917ec29"} Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.534862 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" event={"ID":"bd4d43a7-b705-4282-8493-8c1b3bdd4015","Type":"ContainerDied","Data":"89578bb91fcf988d19c3df4debc0647cfdc22a7e4dc1e876f1e3ddab59bb8717"} Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.534921 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-brdp5" Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.577572 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vrgl5"] Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.590374 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vrgl5"] Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.619577 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-brdp5"] Jan 11 17:48:58 crc kubenswrapper[4837]: I0111 17:48:58.623996 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-brdp5"] Jan 11 17:48:58 crc kubenswrapper[4837]: E0111 17:48:58.739848 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 11 17:48:58 crc kubenswrapper[4837]: E0111 17:48:58.740222 4837 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 11 17:48:58 crc kubenswrapper[4837]: E0111 17:48:58.740388 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-424k8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(cad8e11f-3ef0-4043-a49e-308c103a973f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
layer: context canceled" logger="UnhandledError" Jan 11 17:48:58 crc kubenswrapper[4837]: E0111 17:48:58.741746 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="cad8e11f-3ef0-4043-a49e-308c103a973f" Jan 11 17:48:58 crc kubenswrapper[4837]: W0111 17:48:58.747169 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc3c0fec_5357_46ca_929a_527f01e1eb3d.slice/crio-f3f2e5c7ec940a51b186572b52297a5806e51db0291bf6bc1a5f14be50b1560c WatchSource:0}: Error finding container f3f2e5c7ec940a51b186572b52297a5806e51db0291bf6bc1a5f14be50b1560c: Status 404 returned error can't find the container with id f3f2e5c7ec940a51b186572b52297a5806e51db0291bf6bc1a5f14be50b1560c Jan 11 17:48:59 crc kubenswrapper[4837]: I0111 17:48:59.557361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bc3c0fec-5357-46ca-929a-527f01e1eb3d","Type":"ContainerStarted","Data":"f3f2e5c7ec940a51b186572b52297a5806e51db0291bf6bc1a5f14be50b1560c"} Jan 11 17:48:59 crc kubenswrapper[4837]: I0111 17:48:59.562179 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bafaf023-917f-44a9-807e-b6a0f6a55e77","Type":"ContainerStarted","Data":"b060d5f9e214bc4c0803826f4f0ccf6a83ab3797528bb7aeda7cebc64b0ea6f5"} Jan 11 17:48:59 crc kubenswrapper[4837]: I0111 17:48:59.563377 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5e69a588-3047-499a-b5cb-000fdcc7762a","Type":"ContainerStarted","Data":"cace1ce24454d0f9aeeb3d81732f562ed9dd5dc47f764ffe43f2b795780f4a1e"} Jan 11 17:48:59 crc kubenswrapper[4837]: I0111 17:48:59.566547 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"09134535-27db-4787-89a5-c01f72ffa182","Type":"ContainerStarted","Data":"f30fe1d6a82e9e1d75bda5732b14416a3a113f977efdce61400d425a8d7721bd"} Jan 11 17:48:59 crc kubenswrapper[4837]: E0111 17:48:59.567697 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="cad8e11f-3ef0-4043-a49e-308c103a973f" Jan 11 17:49:00 crc kubenswrapper[4837]: I0111 17:49:00.372292 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4d43a7-b705-4282-8493-8c1b3bdd4015" path="/var/lib/kubelet/pods/bd4d43a7-b705-4282-8493-8c1b3bdd4015/volumes" Jan 11 17:49:00 crc kubenswrapper[4837]: I0111 17:49:00.373015 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b65961-9462-460e-9dff-90141e4f764c" path="/var/lib/kubelet/pods/f7b65961-9462-460e-9dff-90141e4f764c/volumes" Jan 11 17:49:02 crc kubenswrapper[4837]: I0111 17:49:02.591126 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bc3c0fec-5357-46ca-929a-527f01e1eb3d","Type":"ContainerStarted","Data":"dbc5ca229f3baac59b638da2d05c2508d1b3603e98c28b4b788b1ef41ac9ceb0"} Jan 11 17:49:02 crc kubenswrapper[4837]: I0111 17:49:02.593542 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc" event={"ID":"91f28f51-1965-4fdd-bcb8-c261644249d5","Type":"ContainerStarted","Data":"a17b0bcfec31299ffb058a5cdf6aa7efe710a5434abf6cc4aa7c5cf1848934ac"} Jan 11 17:49:02 crc kubenswrapper[4837]: I0111 17:49:02.593665 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zfjdc" Jan 11 17:49:02 crc kubenswrapper[4837]: I0111 17:49:02.596582 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="0e7e2f2f-8ba4-4156-a06f-abe8b8c39477" containerID="fd6545e442bf3e5935f309ac2231c9af84b8db5a8916f9da7d2bcd9153d0bccd" exitCode=0 Jan 11 17:49:02 crc kubenswrapper[4837]: I0111 17:49:02.596638 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-22bpd" event={"ID":"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477","Type":"ContainerDied","Data":"fd6545e442bf3e5935f309ac2231c9af84b8db5a8916f9da7d2bcd9153d0bccd"} Jan 11 17:49:02 crc kubenswrapper[4837]: I0111 17:49:02.598521 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5e69a588-3047-499a-b5cb-000fdcc7762a","Type":"ContainerStarted","Data":"e3ec82d1912f74253209432d18b4cec6c9fbd8101f51c7685c7027c5d313d238"} Jan 11 17:49:02 crc kubenswrapper[4837]: I0111 17:49:02.626263 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zfjdc" podStartSLOduration=30.745332547 podStartE2EDuration="34.626241531s" podCreationTimestamp="2026-01-11 17:48:28 +0000 UTC" firstStartedPulling="2026-01-11 17:48:57.945835469 +0000 UTC m=+1112.124028175" lastFinishedPulling="2026-01-11 17:49:01.826744443 +0000 UTC m=+1116.004937159" observedRunningTime="2026-01-11 17:49:02.624756901 +0000 UTC m=+1116.802949617" watchObservedRunningTime="2026-01-11 17:49:02.626241531 +0000 UTC m=+1116.804434247" Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.612834 4837 generic.go:334] "Generic (PLEG): container finished" podID="bafaf023-917f-44a9-807e-b6a0f6a55e77" containerID="b060d5f9e214bc4c0803826f4f0ccf6a83ab3797528bb7aeda7cebc64b0ea6f5" exitCode=0 Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.613058 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bafaf023-917f-44a9-807e-b6a0f6a55e77","Type":"ContainerDied","Data":"b060d5f9e214bc4c0803826f4f0ccf6a83ab3797528bb7aeda7cebc64b0ea6f5"} Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.616888 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-22bpd" event={"ID":"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477","Type":"ContainerStarted","Data":"de210df3e3c25331afdfa1cb545ee39ecb97a84c329b44c1f5966d2ae7e27afa"} Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.616949 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-22bpd" Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.616970 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-22bpd" event={"ID":"0e7e2f2f-8ba4-4156-a06f-abe8b8c39477","Type":"ContainerStarted","Data":"fc41899f3830e1cff40620ccfeda78efe7815e6a7083f485b1fcdd92185fc512"} Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.616992 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-22bpd" Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.618415 4837 generic.go:334] "Generic (PLEG): container finished" podID="09134535-27db-4787-89a5-c01f72ffa182" containerID="f30fe1d6a82e9e1d75bda5732b14416a3a113f977efdce61400d425a8d7721bd" exitCode=0 Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.618953 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"09134535-27db-4787-89a5-c01f72ffa182","Type":"ContainerDied","Data":"f30fe1d6a82e9e1d75bda5732b14416a3a113f977efdce61400d425a8d7721bd"} Jan 11 17:49:03 crc kubenswrapper[4837]: I0111 17:49:03.668886 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-22bpd" podStartSLOduration=31.820013306 podStartE2EDuration="35.66886716s" podCreationTimestamp="2026-01-11 17:48:28 +0000 UTC" firstStartedPulling="2026-01-11 17:48:57.954324677 +0000 UTC m=+1112.132517383" lastFinishedPulling="2026-01-11 17:49:01.803178531 +0000 UTC m=+1115.981371237" observedRunningTime="2026-01-11 17:49:03.667178444 +0000 UTC m=+1117.845371150" 
watchObservedRunningTime="2026-01-11 17:49:03.66886716 +0000 UTC m=+1117.847059906" Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.648149 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bc3c0fec-5357-46ca-929a-527f01e1eb3d","Type":"ContainerStarted","Data":"a1fef6e6cb43c3fc8d56e4b8f6f32ebfc9c263844819aab763a3f16b0d40e12c"} Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.653426 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bafaf023-917f-44a9-807e-b6a0f6a55e77","Type":"ContainerStarted","Data":"001bcd5d4f8b0a69b7de942f2e7f9a18fa09acb7517380e32714d1ae2e6b67fd"} Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.655333 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5e69a588-3047-499a-b5cb-000fdcc7762a","Type":"ContainerStarted","Data":"15d885c307ea69725e6ccbe173e46e02401dd38450145b44f41486d580bd3516"} Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.657630 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"09134535-27db-4787-89a5-c01f72ffa182","Type":"ContainerStarted","Data":"1a789e21337ff897ebbb4494c101e9ef88defba53a3343dc99bb3edb3639b6da"} Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.674305 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=29.867548273 podStartE2EDuration="36.674279546s" podCreationTimestamp="2026-01-11 17:48:30 +0000 UTC" firstStartedPulling="2026-01-11 17:48:58.765642391 +0000 UTC m=+1112.943835107" lastFinishedPulling="2026-01-11 17:49:05.572373674 +0000 UTC m=+1119.750566380" observedRunningTime="2026-01-11 17:49:06.669554929 +0000 UTC m=+1120.847747725" watchObservedRunningTime="2026-01-11 17:49:06.674279546 +0000 UTC m=+1120.852472292" Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.701463 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=11.979321376 podStartE2EDuration="45.701444975s" podCreationTimestamp="2026-01-11 17:48:21 +0000 UTC" firstStartedPulling="2026-01-11 17:48:23.59683393 +0000 UTC m=+1077.775026636" lastFinishedPulling="2026-01-11 17:48:57.318957519 +0000 UTC m=+1111.497150235" observedRunningTime="2026-01-11 17:49:06.696163583 +0000 UTC m=+1120.874356299" watchObservedRunningTime="2026-01-11 17:49:06.701444975 +0000 UTC m=+1120.879637681" Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.736750 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=30.940744359 podStartE2EDuration="37.736723703s" podCreationTimestamp="2026-01-11 17:48:29 +0000 UTC" firstStartedPulling="2026-01-11 17:48:58.739289404 +0000 UTC m=+1112.917482150" lastFinishedPulling="2026-01-11 17:49:05.535268748 +0000 UTC m=+1119.713461494" observedRunningTime="2026-01-11 17:49:06.722496701 +0000 UTC m=+1120.900689467" watchObservedRunningTime="2026-01-11 17:49:06.736723703 +0000 UTC m=+1120.914916439" Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.762825 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.410044094 podStartE2EDuration="46.762796253s" podCreationTimestamp="2026-01-11 17:48:20 +0000 UTC" firstStartedPulling="2026-01-11 17:48:23.589527834 +0000 UTC m=+1077.767720540" lastFinishedPulling="2026-01-11 17:48:57.942279983 +0000 UTC m=+1112.120472699" observedRunningTime="2026-01-11 17:49:06.751103989 +0000 UTC m=+1120.929296765" watchObservedRunningTime="2026-01-11 17:49:06.762796253 +0000 UTC m=+1120.940988999" Jan 11 17:49:06 crc kubenswrapper[4837]: I0111 17:49:06.917422 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 11 17:49:07 crc kubenswrapper[4837]: 
I0111 17:49:07.030992 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.116923 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.664972 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d1aa5cf3-303a-4a5b-8802-fe264fa090d6","Type":"ContainerStarted","Data":"9344bb0a9885759fef746cca649557d87ff7f9844d1ff9e11d34ae3f172f0e04"} Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.665370 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.692808 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.552425209 podStartE2EDuration="45.692775646s" podCreationTimestamp="2026-01-11 17:48:22 +0000 UTC" firstStartedPulling="2026-01-11 17:48:23.917133683 +0000 UTC m=+1078.095326399" lastFinishedPulling="2026-01-11 17:49:07.0574841 +0000 UTC m=+1121.235676836" observedRunningTime="2026-01-11 17:49:07.686701443 +0000 UTC m=+1121.864894169" watchObservedRunningTime="2026-01-11 17:49:07.692775646 +0000 UTC m=+1121.870968392" Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.717147 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.917600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.963063 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 11 17:49:07 crc kubenswrapper[4837]: I0111 17:49:07.974829 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-mm7j2"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.019793 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mtqk7"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.031824 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.039350 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.039584 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mtqk7"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.084418 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nfclh"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.085755 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.087803 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.119731 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nfclh"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.181915 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/62b32964-26a8-4080-a404-0b40c3122184-ovs-rundir\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.181987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182052 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b32964-26a8-4080-a404-0b40c3122184-combined-ca-bundle\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182082 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8p6\" (UniqueName: \"kubernetes.io/projected/07f908c8-0d67-4e92-8d61-b43e17f5f73d-kube-api-access-kr8p6\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182106 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jxpb\" (UniqueName: \"kubernetes.io/projected/62b32964-26a8-4080-a404-0b40c3122184-kube-api-access-7jxpb\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182126 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-config\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182141 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/62b32964-26a8-4080-a404-0b40c3122184-ovn-rundir\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182159 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b32964-26a8-4080-a404-0b40c3122184-config\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.182184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b32964-26a8-4080-a404-0b40c3122184-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.221964 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285485 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b32964-26a8-4080-a404-0b40c3122184-config\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") 
" pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285553 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b32964-26a8-4080-a404-0b40c3122184-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285590 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/62b32964-26a8-4080-a404-0b40c3122184-ovs-rundir\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285649 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285697 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b32964-26a8-4080-a404-0b40c3122184-combined-ca-bundle\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 
17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8p6\" (UniqueName: \"kubernetes.io/projected/07f908c8-0d67-4e92-8d61-b43e17f5f73d-kube-api-access-kr8p6\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jxpb\" (UniqueName: \"kubernetes.io/projected/62b32964-26a8-4080-a404-0b40c3122184-kube-api-access-7jxpb\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-config\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285787 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/62b32964-26a8-4080-a404-0b40c3122184-ovn-rundir\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.285930 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/62b32964-26a8-4080-a404-0b40c3122184-ovs-rundir\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.286304 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b32964-26a8-4080-a404-0b40c3122184-config\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.286744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.287049 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.287141 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/62b32964-26a8-4080-a404-0b40c3122184-ovn-rundir\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.287274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-config\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.300144 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/62b32964-26a8-4080-a404-0b40c3122184-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.309257 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jxpb\" (UniqueName: \"kubernetes.io/projected/62b32964-26a8-4080-a404-0b40c3122184-kube-api-access-7jxpb\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.311398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8p6\" (UniqueName: \"kubernetes.io/projected/07f908c8-0d67-4e92-8d61-b43e17f5f73d-kube-api-access-kr8p6\") pod \"dnsmasq-dns-7fd796d7df-mtqk7\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.319182 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b32964-26a8-4080-a404-0b40c3122184-combined-ca-bundle\") pod \"ovn-controller-metrics-nfclh\" (UID: \"62b32964-26a8-4080-a404-0b40c3122184\") " pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.356794 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9d6t8"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.379477 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.381175 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-79c66"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.382501 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.388797 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.403950 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-79c66"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.490795 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.491125 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmgqz\" (UniqueName: \"kubernetes.io/projected/238f9a29-a4e5-4a96-a90d-17b17a1200d2-kube-api-access-hmgqz\") pod \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.491242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-dns-svc\") pod \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.491381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-config\") pod \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\" (UID: \"238f9a29-a4e5-4a96-a90d-17b17a1200d2\") " Jan 11 17:49:08 crc 
kubenswrapper[4837]: I0111 17:49:08.491578 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.491668 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.491700 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4fn\" (UniqueName: \"kubernetes.io/projected/e7a1bd32-302a-49e9-a142-942dec63be03-kube-api-access-tl4fn\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.491741 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-config\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.491810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.492242 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-config" (OuterVolumeSpecName: "config") pod "238f9a29-a4e5-4a96-a90d-17b17a1200d2" (UID: "238f9a29-a4e5-4a96-a90d-17b17a1200d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.492533 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "238f9a29-a4e5-4a96-a90d-17b17a1200d2" (UID: "238f9a29-a4e5-4a96-a90d-17b17a1200d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.493710 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nfclh" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.498329 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/238f9a29-a4e5-4a96-a90d-17b17a1200d2-kube-api-access-hmgqz" (OuterVolumeSpecName: "kube-api-access-hmgqz") pod "238f9a29-a4e5-4a96-a90d-17b17a1200d2" (UID: "238f9a29-a4e5-4a96-a90d-17b17a1200d2"). InnerVolumeSpecName "kube-api-access-hmgqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.592686 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.592934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4fn\" (UniqueName: \"kubernetes.io/projected/e7a1bd32-302a-49e9-a142-942dec63be03-kube-api-access-tl4fn\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.593294 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-config\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.593632 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.593968 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-config\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 
17:49:08.595322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.595353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.595968 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.596064 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmgqz\" (UniqueName: \"kubernetes.io/projected/238f9a29-a4e5-4a96-a90d-17b17a1200d2-kube-api-access-hmgqz\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.596075 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.596084 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/238f9a29-a4e5-4a96-a90d-17b17a1200d2-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.596156 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.672931 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" event={"ID":"238f9a29-a4e5-4a96-a90d-17b17a1200d2","Type":"ContainerDied","Data":"e9940f25dacab204423c946fb342bdfa2eb047261c8c4a53d9c7676956a6de15"} Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.673217 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mm7j2" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.727563 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mm7j2"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.728418 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.734042 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mm7j2"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.775908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4fn\" (UniqueName: \"kubernetes.io/projected/e7a1bd32-302a-49e9-a142-942dec63be03-kube-api-access-tl4fn\") pod \"dnsmasq-dns-86db49b7ff-79c66\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.917923 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.919993 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.926210 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.926222 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.926353 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.928366 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nfclh"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.936317 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qtl4q" Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.944983 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 11 17:49:08 crc kubenswrapper[4837]: I0111 17:49:08.987500 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mtqk7"] Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.000929 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-scripts\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.001042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.001086 
4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-config\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.001105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.001141 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z5p8\" (UniqueName: \"kubernetes.io/projected/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-kube-api-access-8z5p8\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.001260 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.001304 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.016735 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:09 crc kubenswrapper[4837]: W0111 17:49:09.078196 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b32964_26a8_4080_a404_0b40c3122184.slice/crio-f4cab71219e56c5fd5810ba53c7d7c333c0f34248f1897ea5c44139a961d458e WatchSource:0}: Error finding container f4cab71219e56c5fd5810ba53c7d7c333c0f34248f1897ea5c44139a961d458e: Status 404 returned error can't find the container with id f4cab71219e56c5fd5810ba53c7d7c333c0f34248f1897ea5c44139a961d458e Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.103266 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-scripts\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.103353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.103395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-config\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.103414 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " 
pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.103441 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z5p8\" (UniqueName: \"kubernetes.io/projected/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-kube-api-access-8z5p8\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.103468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.103491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.104300 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-scripts\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.104474 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-config\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.104660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.109383 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.109502 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.120400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z5p8\" (UniqueName: \"kubernetes.io/projected/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-kube-api-access-8z5p8\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.174169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de66fa79-5d8b-48c3-a30a-af21fbdd19b3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"de66fa79-5d8b-48c3-a30a-af21fbdd19b3\") " pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.235323 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.314472 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.408168 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-config\") pod \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.408659 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plsmv\" (UniqueName: \"kubernetes.io/projected/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-kube-api-access-plsmv\") pod \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.408756 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-dns-svc\") pod \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\" (UID: \"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca\") " Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.409536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca" (UID: "c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.409922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-config" (OuterVolumeSpecName: "config") pod "c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca" (UID: "c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.430967 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-kube-api-access-plsmv" (OuterVolumeSpecName: "kube-api-access-plsmv") pod "c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca" (UID: "c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca"). InnerVolumeSpecName "kube-api-access-plsmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.510238 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.510265 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plsmv\" (UniqueName: \"kubernetes.io/projected/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-kube-api-access-plsmv\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.510276 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.667299 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-79c66"] Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.691655 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.691646 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9d6t8" event={"ID":"c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca","Type":"ContainerDied","Data":"b16f3d51c6d06d04f6e28cc20ba7d997f514ad577976a8ebe1b7b43e67e72849"} Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.693951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nfclh" event={"ID":"62b32964-26a8-4080-a404-0b40c3122184","Type":"ContainerStarted","Data":"f4cab71219e56c5fd5810ba53c7d7c333c0f34248f1897ea5c44139a961d458e"} Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.698115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" event={"ID":"07f908c8-0d67-4e92-8d61-b43e17f5f73d","Type":"ContainerStarted","Data":"168cbb3e4b893d17858c04b932ff12068a61992b49ef5a9ca9a21e6da4846b28"} Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.745192 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.777502 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9d6t8"] Jan 11 17:49:09 crc kubenswrapper[4837]: W0111 17:49:09.777785 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde66fa79_5d8b_48c3_a30a_af21fbdd19b3.slice/crio-1cc1f71b7b58244b937582a264ab6bb385af0a7ee497dad462f2e6beea3ba68f WatchSource:0}: Error finding container 1cc1f71b7b58244b937582a264ab6bb385af0a7ee497dad462f2e6beea3ba68f: Status 404 returned error can't find the container with id 1cc1f71b7b58244b937582a264ab6bb385af0a7ee497dad462f2e6beea3ba68f Jan 11 17:49:09 crc kubenswrapper[4837]: I0111 17:49:09.783108 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-666b6646f7-9d6t8"] Jan 11 17:49:10 crc kubenswrapper[4837]: I0111 17:49:10.381295 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="238f9a29-a4e5-4a96-a90d-17b17a1200d2" path="/var/lib/kubelet/pods/238f9a29-a4e5-4a96-a90d-17b17a1200d2/volumes" Jan 11 17:49:10 crc kubenswrapper[4837]: I0111 17:49:10.382355 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca" path="/var/lib/kubelet/pods/c92d4f03-aa9b-4056-8e9b-2ef99a23d6ca/volumes" Jan 11 17:49:10 crc kubenswrapper[4837]: I0111 17:49:10.712654 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" event={"ID":"e7a1bd32-302a-49e9-a142-942dec63be03","Type":"ContainerStarted","Data":"d235391e253a6595b7a2e6982e5d878ca0e2a1a8c6e30052aac1c6265a993dbc"} Jan 11 17:49:10 crc kubenswrapper[4837]: I0111 17:49:10.715640 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"de66fa79-5d8b-48c3-a30a-af21fbdd19b3","Type":"ContainerStarted","Data":"1cc1f71b7b58244b937582a264ab6bb385af0a7ee497dad462f2e6beea3ba68f"} Jan 11 17:49:11 crc kubenswrapper[4837]: I0111 17:49:11.721581 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f21b505a-45c3-4f7e-b323-204d384185b9","Type":"ContainerStarted","Data":"c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc"} Jan 11 17:49:11 crc kubenswrapper[4837]: I0111 17:49:11.723753 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nfclh" event={"ID":"62b32964-26a8-4080-a404-0b40c3122184","Type":"ContainerStarted","Data":"cae997d2134b43c43fd49bd8f1cde49f3d8d5775757b07fb1a56506e125c27c2"} Jan 11 17:49:11 crc kubenswrapper[4837]: I0111 17:49:11.725270 4837 generic.go:334] "Generic (PLEG): container finished" podID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" 
containerID="c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241" exitCode=0 Jan 11 17:49:11 crc kubenswrapper[4837]: I0111 17:49:11.725316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" event={"ID":"07f908c8-0d67-4e92-8d61-b43e17f5f73d","Type":"ContainerDied","Data":"c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241"} Jan 11 17:49:11 crc kubenswrapper[4837]: I0111 17:49:11.727910 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89ca88bf-2462-4fce-8a85-8dc04655b21c","Type":"ContainerStarted","Data":"8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051"} Jan 11 17:49:11 crc kubenswrapper[4837]: I0111 17:49:11.801407 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nfclh" podStartSLOduration=3.801388478 podStartE2EDuration="3.801388478s" podCreationTimestamp="2026-01-11 17:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:49:11.801121121 +0000 UTC m=+1125.979313827" watchObservedRunningTime="2026-01-11 17:49:11.801388478 +0000 UTC m=+1125.979581184" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.580515 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.580969 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.737964 4837 generic.go:334] "Generic (PLEG): container finished" podID="e7a1bd32-302a-49e9-a142-942dec63be03" containerID="0ffdf69a1ed83907515afc15b5586ce4041929133d7be7461fb025aa80cdeb93" exitCode=0 Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.738175 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-79c66" event={"ID":"e7a1bd32-302a-49e9-a142-942dec63be03","Type":"ContainerDied","Data":"0ffdf69a1ed83907515afc15b5586ce4041929133d7be7461fb025aa80cdeb93"} Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.741596 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"de66fa79-5d8b-48c3-a30a-af21fbdd19b3","Type":"ContainerStarted","Data":"470d1d2cf0ca545d13cf7275a719aed6388fbcd01ede8bd6da609c32eecd2d9c"} Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.741661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"de66fa79-5d8b-48c3-a30a-af21fbdd19b3","Type":"ContainerStarted","Data":"fe50a8cd05bf4fa13c6d53c96f77831f483a79bb76188b03ea310a40c14306ae"} Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.741716 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.745709 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" event={"ID":"07f908c8-0d67-4e92-8d61-b43e17f5f73d","Type":"ContainerStarted","Data":"71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa"} Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.745844 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.827322 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" podStartSLOduration=3.575432525 podStartE2EDuration="5.827295557s" podCreationTimestamp="2026-01-11 17:49:07 +0000 UTC" firstStartedPulling="2026-01-11 17:49:09.181855608 +0000 UTC m=+1123.360048314" lastFinishedPulling="2026-01-11 17:49:11.43371861 +0000 UTC m=+1125.611911346" observedRunningTime="2026-01-11 17:49:12.813900277 +0000 UTC m=+1126.992093073" 
watchObservedRunningTime="2026-01-11 17:49:12.827295557 +0000 UTC m=+1127.005488273" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.851776 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.423320359 podStartE2EDuration="4.851757615s" podCreationTimestamp="2026-01-11 17:49:08 +0000 UTC" firstStartedPulling="2026-01-11 17:49:09.780553291 +0000 UTC m=+1123.958745997" lastFinishedPulling="2026-01-11 17:49:12.208990547 +0000 UTC m=+1126.387183253" observedRunningTime="2026-01-11 17:49:12.842881446 +0000 UTC m=+1127.021074232" watchObservedRunningTime="2026-01-11 17:49:12.851757615 +0000 UTC m=+1127.029950321" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.953807 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 11 17:49:12 crc kubenswrapper[4837]: I0111 17:49:12.953847 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 11 17:49:13 crc kubenswrapper[4837]: I0111 17:49:13.056011 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 11 17:49:13 crc kubenswrapper[4837]: I0111 17:49:13.223462 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 11 17:49:13 crc kubenswrapper[4837]: I0111 17:49:13.756591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" event={"ID":"e7a1bd32-302a-49e9-a142-942dec63be03","Type":"ContainerStarted","Data":"1ea4cac02a3e38e4405e89d50c0227c8e1c5d36ab412a14c0dbcb3a3c5750e3f"} Jan 11 17:49:13 crc kubenswrapper[4837]: I0111 17:49:13.757079 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:13 crc kubenswrapper[4837]: I0111 17:49:13.791567 4837 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" podStartSLOduration=3.894815778 podStartE2EDuration="5.79154263s" podCreationTimestamp="2026-01-11 17:49:08 +0000 UTC" firstStartedPulling="2026-01-11 17:49:09.69449448 +0000 UTC m=+1123.872687186" lastFinishedPulling="2026-01-11 17:49:11.591221332 +0000 UTC m=+1125.769414038" observedRunningTime="2026-01-11 17:49:13.786364661 +0000 UTC m=+1127.964557427" watchObservedRunningTime="2026-01-11 17:49:13.79154263 +0000 UTC m=+1127.969735376" Jan 11 17:49:13 crc kubenswrapper[4837]: I0111 17:49:13.884771 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 11 17:49:14 crc kubenswrapper[4837]: I0111 17:49:14.966105 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.085819 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.254807 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mtqk7"] Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.300747 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-rv55z"] Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.301835 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.326524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rv55z"] Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.349793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-config\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.349839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.349878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-dns-svc\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.349971 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.350012 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9v866\" (UniqueName: \"kubernetes.io/projected/281a517d-5b8e-413b-8e6c-6555318d70c8-kube-api-access-9v866\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.451213 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-dns-svc\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.451305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.451344 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v866\" (UniqueName: \"kubernetes.io/projected/281a517d-5b8e-413b-8e6c-6555318d70c8-kube-api-access-9v866\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.451412 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-config\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.451432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.452134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-dns-svc\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.452146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.452172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.452640 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-config\") pod \"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.473170 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v866\" (UniqueName: \"kubernetes.io/projected/281a517d-5b8e-413b-8e6c-6555318d70c8-kube-api-access-9v866\") pod 
\"dnsmasq-dns-698758b865-rv55z\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.622344 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:15 crc kubenswrapper[4837]: I0111 17:49:15.773224 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" podUID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" containerName="dnsmasq-dns" containerID="cri-o://71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa" gracePeriod=10 Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.300038 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.338124 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.338455 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" containerName="init" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.338473 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" containerName="init" Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.338490 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" containerName="dnsmasq-dns" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.338496 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" containerName="dnsmasq-dns" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.338637 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" containerName="dnsmasq-dns" Jan 11 17:49:16 crc 
kubenswrapper[4837]: I0111 17:49:16.344880 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.346534 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.347595 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-t52m2" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.347758 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.347852 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368166 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-ovsdbserver-nb\") pod \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368204 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-dns-svc\") pod \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-config\") pod \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368266 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kr8p6\" (UniqueName: \"kubernetes.io/projected/07f908c8-0d67-4e92-8d61-b43e17f5f73d-kube-api-access-kr8p6\") pod \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\" (UID: \"07f908c8-0d67-4e92-8d61-b43e17f5f73d\") " Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/23a0b787-b5b4-4a4e-828b-d7f34853603f-lock\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368487 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368518 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/23a0b787-b5b4-4a4e-828b-d7f34853603f-cache\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368554 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.368587 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkg7q\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-kube-api-access-fkg7q\") pod \"swift-storage-0\" (UID: 
\"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.375075 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f908c8-0d67-4e92-8d61-b43e17f5f73d-kube-api-access-kr8p6" (OuterVolumeSpecName: "kube-api-access-kr8p6") pod "07f908c8-0d67-4e92-8d61-b43e17f5f73d" (UID: "07f908c8-0d67-4e92-8d61-b43e17f5f73d"). InnerVolumeSpecName "kube-api-access-kr8p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.378583 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 11 17:49:16 crc kubenswrapper[4837]: W0111 17:49:16.392054 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod281a517d_5b8e_413b_8e6c_6555318d70c8.slice/crio-9145b53e54b9a0ffc304535ec138f16427619aaec454e617ed64f3cb1d7dafe1 WatchSource:0}: Error finding container 9145b53e54b9a0ffc304535ec138f16427619aaec454e617ed64f3cb1d7dafe1: Status 404 returned error can't find the container with id 9145b53e54b9a0ffc304535ec138f16427619aaec454e617ed64f3cb1d7dafe1 Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.411083 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rv55z"] Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.411220 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-config" (OuterVolumeSpecName: "config") pod "07f908c8-0d67-4e92-8d61-b43e17f5f73d" (UID: "07f908c8-0d67-4e92-8d61-b43e17f5f73d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.413640 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07f908c8-0d67-4e92-8d61-b43e17f5f73d" (UID: "07f908c8-0d67-4e92-8d61-b43e17f5f73d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.420437 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07f908c8-0d67-4e92-8d61-b43e17f5f73d" (UID: "07f908c8-0d67-4e92-8d61-b43e17f5f73d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469350 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/23a0b787-b5b4-4a4e-828b-d7f34853603f-cache\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: 
I0111 17:49:16.469462 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkg7q\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-kube-api-access-fkg7q\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469591 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/23a0b787-b5b4-4a4e-828b-d7f34853603f-lock\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469694 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr8p6\" (UniqueName: \"kubernetes.io/projected/07f908c8-0d67-4e92-8d61-b43e17f5f73d-kube-api-access-kr8p6\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469702 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469726 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469744 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cache\" (UniqueName: \"kubernetes.io/empty-dir/23a0b787-b5b4-4a4e-828b-d7f34853603f-cache\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.469842 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f908c8-0d67-4e92-8d61-b43e17f5f73d-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.470359 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.470378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/23a0b787-b5b4-4a4e-828b-d7f34853603f-lock\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.470491 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.470824 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift podName:23a0b787-b5b4-4a4e-828b-d7f34853603f nodeName:}" failed. No retries permitted until 2026-01-11 17:49:16.970799245 +0000 UTC m=+1131.148992051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift") pod "swift-storage-0" (UID: "23a0b787-b5b4-4a4e-828b-d7f34853603f") : configmap "swift-ring-files" not found Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.488603 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkg7q\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-kube-api-access-fkg7q\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.497155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.782158 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cad8e11f-3ef0-4043-a49e-308c103a973f","Type":"ContainerStarted","Data":"7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957"} Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.782375 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.784249 4837 generic.go:334] "Generic (PLEG): container finished" podID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerID="01765bd1879b2d50a7280384405ec922ebd0d4d2f673ecbb8a4a39fffb4e8ab5" exitCode=0 Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.784302 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rv55z" event={"ID":"281a517d-5b8e-413b-8e6c-6555318d70c8","Type":"ContainerDied","Data":"01765bd1879b2d50a7280384405ec922ebd0d4d2f673ecbb8a4a39fffb4e8ab5"} Jan 
11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.784338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rv55z" event={"ID":"281a517d-5b8e-413b-8e6c-6555318d70c8","Type":"ContainerStarted","Data":"9145b53e54b9a0ffc304535ec138f16427619aaec454e617ed64f3cb1d7dafe1"} Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.787784 4837 generic.go:334] "Generic (PLEG): container finished" podID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" containerID="71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa" exitCode=0 Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.787879 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.788850 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" event={"ID":"07f908c8-0d67-4e92-8d61-b43e17f5f73d","Type":"ContainerDied","Data":"71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa"} Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.788912 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-mtqk7" event={"ID":"07f908c8-0d67-4e92-8d61-b43e17f5f73d","Type":"ContainerDied","Data":"168cbb3e4b893d17858c04b932ff12068a61992b49ef5a9ca9a21e6da4846b28"} Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.788937 4837 scope.go:117] "RemoveContainer" containerID="71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.809012 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.676276523 podStartE2EDuration="52.808996679s" podCreationTimestamp="2026-01-11 17:48:24 +0000 UTC" firstStartedPulling="2026-01-11 17:48:25.8121232 +0000 UTC m=+1079.990315906" lastFinishedPulling="2026-01-11 17:49:15.944843356 +0000 UTC 
m=+1130.123036062" observedRunningTime="2026-01-11 17:49:16.803118502 +0000 UTC m=+1130.981311198" watchObservedRunningTime="2026-01-11 17:49:16.808996679 +0000 UTC m=+1130.987189385" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.857528 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qn5fn"] Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.858515 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.862377 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.862622 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.862762 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.864608 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qn5fn"] Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.911705 4837 scope.go:117] "RemoveContainer" containerID="c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.915257 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mtqk7"] Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.921270 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-mtqk7"] Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.958758 4837 scope.go:117] "RemoveContainer" containerID="71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa" Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.959164 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa\": container with ID starting with 71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa not found: ID does not exist" containerID="71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.959216 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa"} err="failed to get container status \"71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa\": rpc error: code = NotFound desc = could not find container \"71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa\": container with ID starting with 71423823219f063cb47d2df9edd8066df7e2f744f633c98493f2eab1eedfd8fa not found: ID does not exist" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.959249 4837 scope.go:117] "RemoveContainer" containerID="c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241" Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.959522 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241\": container with ID starting with c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241 not found: ID does not exist" containerID="c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.959546 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241"} err="failed to get container status \"c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241\": rpc error: code = NotFound desc = could not find container 
\"c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241\": container with ID starting with c82518b50509709b49151f3591ef9ce17fdd44fbdf5c6c3620a89ab84c39e241 not found: ID does not exist" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.977942 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-combined-ca-bundle\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.978023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.978054 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-swiftconf\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.978109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-ring-data-devices\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.978211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-scripts\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.978356 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.978379 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 11 17:49:16 crc kubenswrapper[4837]: E0111 17:49:16.978425 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift podName:23a0b787-b5b4-4a4e-828b-d7f34853603f nodeName:}" failed. No retries permitted until 2026-01-11 17:49:17.978407251 +0000 UTC m=+1132.156599967 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift") pod "swift-storage-0" (UID: "23a0b787-b5b4-4a4e-828b-d7f34853603f") : configmap "swift-ring-files" not found Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.978867 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/43068ba1-1d19-4822-88fa-e52f8fb21738-etc-swift\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.979021 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnsvp\" (UniqueName: \"kubernetes.io/projected/43068ba1-1d19-4822-88fa-e52f8fb21738-kube-api-access-rnsvp\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " 
pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:16 crc kubenswrapper[4837]: I0111 17:49:16.979135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-dispersionconf\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.080272 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-swiftconf\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.080324 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-ring-data-devices\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.080386 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-scripts\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.080438 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/43068ba1-1d19-4822-88fa-e52f8fb21738-etc-swift\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 
17:49:17.080470 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnsvp\" (UniqueName: \"kubernetes.io/projected/43068ba1-1d19-4822-88fa-e52f8fb21738-kube-api-access-rnsvp\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.080494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-dispersionconf\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.080551 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-combined-ca-bundle\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.081220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/43068ba1-1d19-4822-88fa-e52f8fb21738-etc-swift\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.081468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-scripts\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.081989 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-ring-data-devices\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.085426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-dispersionconf\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.086942 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-combined-ca-bundle\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.087013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-swiftconf\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.094451 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnsvp\" (UniqueName: \"kubernetes.io/projected/43068ba1-1d19-4822-88fa-e52f8fb21738-kube-api-access-rnsvp\") pod \"swift-ring-rebalance-qn5fn\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.210303 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.639362 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qn5fn"] Jan 11 17:49:17 crc kubenswrapper[4837]: W0111 17:49:17.649271 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43068ba1_1d19_4822_88fa_e52f8fb21738.slice/crio-55a37974e006e2d77d11c40fb1ec9a7fcd1c38e2df0c9636a9f4e6fe6f9b173b WatchSource:0}: Error finding container 55a37974e006e2d77d11c40fb1ec9a7fcd1c38e2df0c9636a9f4e6fe6f9b173b: Status 404 returned error can't find the container with id 55a37974e006e2d77d11c40fb1ec9a7fcd1c38e2df0c9636a9f4e6fe6f9b173b Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.796342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qn5fn" event={"ID":"43068ba1-1d19-4822-88fa-e52f8fb21738","Type":"ContainerStarted","Data":"55a37974e006e2d77d11c40fb1ec9a7fcd1c38e2df0c9636a9f4e6fe6f9b173b"} Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.798908 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rv55z" event={"ID":"281a517d-5b8e-413b-8e6c-6555318d70c8","Type":"ContainerStarted","Data":"c1c6287f733a41f27571dc8fc09e2b56cec55dba13b85e711379023f35711626"} Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.799329 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.826173 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-rv55z" podStartSLOduration=2.826139923 podStartE2EDuration="2.826139923s" podCreationTimestamp="2026-01-11 17:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-11 17:49:17.82006451 +0000 UTC m=+1131.998257226" watchObservedRunningTime="2026-01-11 17:49:17.826139923 +0000 UTC m=+1132.004332669" Jan 11 17:49:17 crc kubenswrapper[4837]: I0111 17:49:17.998308 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:17 crc kubenswrapper[4837]: E0111 17:49:17.998865 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 11 17:49:17 crc kubenswrapper[4837]: E0111 17:49:17.998893 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 11 17:49:17 crc kubenswrapper[4837]: E0111 17:49:17.998962 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift podName:23a0b787-b5b4-4a4e-828b-d7f34853603f nodeName:}" failed. No retries permitted until 2026-01-11 17:49:19.998938446 +0000 UTC m=+1134.177131192 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift") pod "swift-storage-0" (UID: "23a0b787-b5b4-4a4e-828b-d7f34853603f") : configmap "swift-ring-files" not found Jan 11 17:49:18 crc kubenswrapper[4837]: I0111 17:49:18.372613 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f908c8-0d67-4e92-8d61-b43e17f5f73d" path="/var/lib/kubelet/pods/07f908c8-0d67-4e92-8d61-b43e17f5f73d/volumes" Jan 11 17:49:19 crc kubenswrapper[4837]: I0111 17:49:19.019549 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.031985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:20 crc kubenswrapper[4837]: E0111 17:49:20.032241 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 11 17:49:20 crc kubenswrapper[4837]: E0111 17:49:20.032296 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 11 17:49:20 crc kubenswrapper[4837]: E0111 17:49:20.032390 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift podName:23a0b787-b5b4-4a4e-828b-d7f34853603f nodeName:}" failed. No retries permitted until 2026-01-11 17:49:24.032360711 +0000 UTC m=+1138.210553447 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift") pod "swift-storage-0" (UID: "23a0b787-b5b4-4a4e-828b-d7f34853603f") : configmap "swift-ring-files" not found Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.137910 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-l5kvn"] Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.138956 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.141403 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.158705 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-l5kvn"] Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.248336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgs4r\" (UniqueName: \"kubernetes.io/projected/3286e3b8-db71-4590-8874-b255bb5600cd-kube-api-access-cgs4r\") pod \"root-account-create-update-l5kvn\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.248563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3286e3b8-db71-4590-8874-b255bb5600cd-operator-scripts\") pod \"root-account-create-update-l5kvn\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.349879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgs4r\" (UniqueName: 
\"kubernetes.io/projected/3286e3b8-db71-4590-8874-b255bb5600cd-kube-api-access-cgs4r\") pod \"root-account-create-update-l5kvn\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.349999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3286e3b8-db71-4590-8874-b255bb5600cd-operator-scripts\") pod \"root-account-create-update-l5kvn\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.352464 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3286e3b8-db71-4590-8874-b255bb5600cd-operator-scripts\") pod \"root-account-create-update-l5kvn\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.370365 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgs4r\" (UniqueName: \"kubernetes.io/projected/3286e3b8-db71-4590-8874-b255bb5600cd-kube-api-access-cgs4r\") pod \"root-account-create-update-l5kvn\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:20 crc kubenswrapper[4837]: I0111 17:49:20.489798 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:21 crc kubenswrapper[4837]: I0111 17:49:21.478548 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-l5kvn"] Jan 11 17:49:21 crc kubenswrapper[4837]: I0111 17:49:21.836643 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l5kvn" event={"ID":"3286e3b8-db71-4590-8874-b255bb5600cd","Type":"ContainerStarted","Data":"55f365005fc8749ef4656392442751b8e3b126a78d83c6835d6fa4b8ff96a9c0"} Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.782771 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-m8sjp"] Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.784111 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.791767 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz7kc\" (UniqueName: \"kubernetes.io/projected/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-kube-api-access-gz7kc\") pod \"keystone-db-create-m8sjp\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.791942 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-operator-scripts\") pod \"keystone-db-create-m8sjp\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.812281 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-m8sjp"] Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.850046 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-qn5fn" event={"ID":"43068ba1-1d19-4822-88fa-e52f8fb21738","Type":"ContainerStarted","Data":"d3dd869d3fd64b1f5ab874a0f53ad826f8a488a91a162c3b44beaa01e0102440"} Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.865244 4837 generic.go:334] "Generic (PLEG): container finished" podID="3286e3b8-db71-4590-8874-b255bb5600cd" containerID="892c1071f039c7f21325f41a2bcca20b48d3925ea457a5d2c914d76d7af02ee7" exitCode=0 Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.865334 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l5kvn" event={"ID":"3286e3b8-db71-4590-8874-b255bb5600cd","Type":"ContainerDied","Data":"892c1071f039c7f21325f41a2bcca20b48d3925ea457a5d2c914d76d7af02ee7"} Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.892114 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qn5fn" podStartSLOduration=3.174830784 podStartE2EDuration="6.892095653s" podCreationTimestamp="2026-01-11 17:49:16 +0000 UTC" firstStartedPulling="2026-01-11 17:49:17.651646956 +0000 UTC m=+1131.829839662" lastFinishedPulling="2026-01-11 17:49:21.368911815 +0000 UTC m=+1135.547104531" observedRunningTime="2026-01-11 17:49:22.886264567 +0000 UTC m=+1137.064457273" watchObservedRunningTime="2026-01-11 17:49:22.892095653 +0000 UTC m=+1137.070288369" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.893618 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz7kc\" (UniqueName: \"kubernetes.io/projected/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-kube-api-access-gz7kc\") pod \"keystone-db-create-m8sjp\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.893708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-operator-scripts\") pod \"keystone-db-create-m8sjp\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.894471 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-operator-scripts\") pod \"keystone-db-create-m8sjp\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.904833 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ce95-account-create-update-mv8vt"] Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.905873 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.911427 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.919933 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ce95-account-create-update-mv8vt"] Jan 11 17:49:22 crc kubenswrapper[4837]: I0111 17:49:22.920351 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz7kc\" (UniqueName: \"kubernetes.io/projected/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-kube-api-access-gz7kc\") pod \"keystone-db-create-m8sjp\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.097294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c933d5f-4b3e-43c8-a87d-67b274355687-operator-scripts\") pod 
\"keystone-ce95-account-create-update-mv8vt\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.097403 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ng5r\" (UniqueName: \"kubernetes.io/projected/7c933d5f-4b3e-43c8-a87d-67b274355687-kube-api-access-5ng5r\") pod \"keystone-ce95-account-create-update-mv8vt\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.111828 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.134101 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dv96d"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.135575 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.142729 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dv96d"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.201831 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ng5r\" (UniqueName: \"kubernetes.io/projected/7c933d5f-4b3e-43c8-a87d-67b274355687-kube-api-access-5ng5r\") pod \"keystone-ce95-account-create-update-mv8vt\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.202278 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fpb5\" (UniqueName: \"kubernetes.io/projected/76495fd7-f795-4340-85fa-9f1469bbd1aa-kube-api-access-9fpb5\") pod \"placement-db-create-dv96d\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.202329 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76495fd7-f795-4340-85fa-9f1469bbd1aa-operator-scripts\") pod \"placement-db-create-dv96d\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.202409 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c933d5f-4b3e-43c8-a87d-67b274355687-operator-scripts\") pod \"keystone-ce95-account-create-update-mv8vt\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.203383 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c933d5f-4b3e-43c8-a87d-67b274355687-operator-scripts\") pod \"keystone-ce95-account-create-update-mv8vt\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.219773 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ng5r\" (UniqueName: \"kubernetes.io/projected/7c933d5f-4b3e-43c8-a87d-67b274355687-kube-api-access-5ng5r\") pod \"keystone-ce95-account-create-update-mv8vt\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.247957 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-047d-account-create-update-9296c"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.249243 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.251989 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.254096 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.255467 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-047d-account-create-update-9296c"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.308338 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fpb5\" (UniqueName: \"kubernetes.io/projected/76495fd7-f795-4340-85fa-9f1469bbd1aa-kube-api-access-9fpb5\") pod \"placement-db-create-dv96d\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.308408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76495fd7-f795-4340-85fa-9f1469bbd1aa-operator-scripts\") pod \"placement-db-create-dv96d\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.308460 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5853482-7ef2-4be7-83ac-5212d4db1696-operator-scripts\") pod \"placement-047d-account-create-update-9296c\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.308507 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqztt\" (UniqueName: \"kubernetes.io/projected/a5853482-7ef2-4be7-83ac-5212d4db1696-kube-api-access-vqztt\") pod \"placement-047d-account-create-update-9296c\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.309545 
4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76495fd7-f795-4340-85fa-9f1469bbd1aa-operator-scripts\") pod \"placement-db-create-dv96d\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.339239 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fpb5\" (UniqueName: \"kubernetes.io/projected/76495fd7-f795-4340-85fa-9f1469bbd1aa-kube-api-access-9fpb5\") pod \"placement-db-create-dv96d\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.405815 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-b9fmt"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.406768 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.409456 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5853482-7ef2-4be7-83ac-5212d4db1696-operator-scripts\") pod \"placement-047d-account-create-update-9296c\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.409501 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqztt\" (UniqueName: \"kubernetes.io/projected/a5853482-7ef2-4be7-83ac-5212d4db1696-kube-api-access-vqztt\") pod \"placement-047d-account-create-update-9296c\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.411073 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5853482-7ef2-4be7-83ac-5212d4db1696-operator-scripts\") pod \"placement-047d-account-create-update-9296c\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.415187 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-b9fmt"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.435111 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqztt\" (UniqueName: \"kubernetes.io/projected/a5853482-7ef2-4be7-83ac-5212d4db1696-kube-api-access-vqztt\") pod \"placement-047d-account-create-update-9296c\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.513156 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0934ac-db06-4719-9ee0-edbefddcd983-operator-scripts\") pod \"glance-db-create-b9fmt\" (UID: \"3a0934ac-db06-4719-9ee0-edbefddcd983\") " pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.514085 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8sr5\" (UniqueName: \"kubernetes.io/projected/3a0934ac-db06-4719-9ee0-edbefddcd983-kube-api-access-k8sr5\") pod \"glance-db-create-b9fmt\" (UID: \"3a0934ac-db06-4719-9ee0-edbefddcd983\") " pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.543509 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8247-account-create-update-chtt6"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.544729 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.547371 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.550817 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8247-account-create-update-chtt6"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.617563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129b7610-1c04-47c2-ba4d-4c20195c2071-operator-scripts\") pod \"glance-8247-account-create-update-chtt6\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.617964 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8sr5\" (UniqueName: \"kubernetes.io/projected/3a0934ac-db06-4719-9ee0-edbefddcd983-kube-api-access-k8sr5\") pod \"glance-db-create-b9fmt\" (UID: \"3a0934ac-db06-4719-9ee0-edbefddcd983\") " pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.618109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xl9q\" (UniqueName: \"kubernetes.io/projected/129b7610-1c04-47c2-ba4d-4c20195c2071-kube-api-access-5xl9q\") pod \"glance-8247-account-create-update-chtt6\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.618303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0934ac-db06-4719-9ee0-edbefddcd983-operator-scripts\") pod \"glance-db-create-b9fmt\" (UID: 
\"3a0934ac-db06-4719-9ee0-edbefddcd983\") " pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.619129 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0934ac-db06-4719-9ee0-edbefddcd983-operator-scripts\") pod \"glance-db-create-b9fmt\" (UID: \"3a0934ac-db06-4719-9ee0-edbefddcd983\") " pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.638265 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dv96d" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.638724 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8sr5\" (UniqueName: \"kubernetes.io/projected/3a0934ac-db06-4719-9ee0-edbefddcd983-kube-api-access-k8sr5\") pod \"glance-db-create-b9fmt\" (UID: \"3a0934ac-db06-4719-9ee0-edbefddcd983\") " pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.647693 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:23 crc kubenswrapper[4837]: W0111 17:49:23.665829 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ae93f3_f5ad_4225_86a9_c7bf0748d84f.slice/crio-68a625d5bf9975c052b4456c121cc3f7a6713e85fa06ab3fdef1cae30b3fb6cb WatchSource:0}: Error finding container 68a625d5bf9975c052b4456c121cc3f7a6713e85fa06ab3fdef1cae30b3fb6cb: Status 404 returned error can't find the container with id 68a625d5bf9975c052b4456c121cc3f7a6713e85fa06ab3fdef1cae30b3fb6cb Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.667278 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-m8sjp"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.719947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129b7610-1c04-47c2-ba4d-4c20195c2071-operator-scripts\") pod \"glance-8247-account-create-update-chtt6\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.720122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xl9q\" (UniqueName: \"kubernetes.io/projected/129b7610-1c04-47c2-ba4d-4c20195c2071-kube-api-access-5xl9q\") pod \"glance-8247-account-create-update-chtt6\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.720802 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129b7610-1c04-47c2-ba4d-4c20195c2071-operator-scripts\") pod \"glance-8247-account-create-update-chtt6\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " 
pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.726344 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.732983 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ce95-account-create-update-mv8vt"] Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.741845 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xl9q\" (UniqueName: \"kubernetes.io/projected/129b7610-1c04-47c2-ba4d-4c20195c2071-kube-api-access-5xl9q\") pod \"glance-8247-account-create-update-chtt6\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.864819 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.890989 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce95-account-create-update-mv8vt" event={"ID":"7c933d5f-4b3e-43c8-a87d-67b274355687","Type":"ContainerStarted","Data":"1dfe1685c7c33c409617b3ea0b6089569997b8bcc6d967c5b713fa8e1fb43d21"} Jan 11 17:49:23 crc kubenswrapper[4837]: I0111 17:49:23.892067 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-m8sjp" event={"ID":"65ae93f3-f5ad-4225-86a9-c7bf0748d84f","Type":"ContainerStarted","Data":"68a625d5bf9975c052b4456c121cc3f7a6713e85fa06ab3fdef1cae30b3fb6cb"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.126820 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " 
pod="openstack/swift-storage-0" Jan 11 17:49:24 crc kubenswrapper[4837]: E0111 17:49:24.126997 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 11 17:49:24 crc kubenswrapper[4837]: E0111 17:49:24.127025 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 11 17:49:24 crc kubenswrapper[4837]: E0111 17:49:24.127077 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift podName:23a0b787-b5b4-4a4e-828b-d7f34853603f nodeName:}" failed. No retries permitted until 2026-01-11 17:49:32.127061229 +0000 UTC m=+1146.305253935 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift") pod "swift-storage-0" (UID: "23a0b787-b5b4-4a4e-828b-d7f34853603f") : configmap "swift-ring-files" not found Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.135303 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dv96d"] Jan 11 17:49:24 crc kubenswrapper[4837]: W0111 17:49:24.136811 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76495fd7_f795_4340_85fa_9f1469bbd1aa.slice/crio-aa9c174d0ff9f8e619b4b9382be8d48ddda3aaca4a6ca65cdb61cc74963a486c WatchSource:0}: Error finding container aa9c174d0ff9f8e619b4b9382be8d48ddda3aaca4a6ca65cdb61cc74963a486c: Status 404 returned error can't find the container with id aa9c174d0ff9f8e619b4b9382be8d48ddda3aaca4a6ca65cdb61cc74963a486c Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.242504 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-047d-account-create-update-9296c"] Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.319167 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.374303 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.398974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-b9fmt"] Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.432429 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3286e3b8-db71-4590-8874-b255bb5600cd-operator-scripts\") pod \"3286e3b8-db71-4590-8874-b255bb5600cd\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.432662 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgs4r\" (UniqueName: \"kubernetes.io/projected/3286e3b8-db71-4590-8874-b255bb5600cd-kube-api-access-cgs4r\") pod \"3286e3b8-db71-4590-8874-b255bb5600cd\" (UID: \"3286e3b8-db71-4590-8874-b255bb5600cd\") " Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.432964 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3286e3b8-db71-4590-8874-b255bb5600cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3286e3b8-db71-4590-8874-b255bb5600cd" (UID: "3286e3b8-db71-4590-8874-b255bb5600cd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.433194 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3286e3b8-db71-4590-8874-b255bb5600cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.440012 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3286e3b8-db71-4590-8874-b255bb5600cd-kube-api-access-cgs4r" (OuterVolumeSpecName: "kube-api-access-cgs4r") pod "3286e3b8-db71-4590-8874-b255bb5600cd" (UID: "3286e3b8-db71-4590-8874-b255bb5600cd"). InnerVolumeSpecName "kube-api-access-cgs4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.448663 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8247-account-create-update-chtt6"] Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.535141 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgs4r\" (UniqueName: \"kubernetes.io/projected/3286e3b8-db71-4590-8874-b255bb5600cd-kube-api-access-cgs4r\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.903091 4837 generic.go:334] "Generic (PLEG): container finished" podID="65ae93f3-f5ad-4225-86a9-c7bf0748d84f" containerID="4946608abe9f38b0ce7bd682101e55184bf0348f285b29d2181c1d450d82e7be" exitCode=0 Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.903185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-m8sjp" event={"ID":"65ae93f3-f5ad-4225-86a9-c7bf0748d84f","Type":"ContainerDied","Data":"4946608abe9f38b0ce7bd682101e55184bf0348f285b29d2181c1d450d82e7be"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.906888 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8247-account-create-update-chtt6" 
event={"ID":"129b7610-1c04-47c2-ba4d-4c20195c2071","Type":"ContainerStarted","Data":"f1a691cbefb207e0912f108bb1859fef3a8ca9a89fb8eee662e8d939b31df464"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.906962 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8247-account-create-update-chtt6" event={"ID":"129b7610-1c04-47c2-ba4d-4c20195c2071","Type":"ContainerStarted","Data":"31416beb4552974bc3a4b44bc247070e12a5357532fa5c46e4e909975203450b"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.909272 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-047d-account-create-update-9296c" event={"ID":"a5853482-7ef2-4be7-83ac-5212d4db1696","Type":"ContainerStarted","Data":"274cf8df922185f1b531431260cbe8bc072410bcd7e3df68013113f1b78a39c4"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.909328 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-047d-account-create-update-9296c" event={"ID":"a5853482-7ef2-4be7-83ac-5212d4db1696","Type":"ContainerStarted","Data":"ac6e2ce8399b737ea76ecab1afd967bb501f59a65e4990c8b5e6eb4969631963"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.911296 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dv96d" event={"ID":"76495fd7-f795-4340-85fa-9f1469bbd1aa","Type":"ContainerStarted","Data":"0a8f3ee29272f42b879ea7bd68b49ee864cdc1330c6740bc020afd8b722065b3"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.911506 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dv96d" event={"ID":"76495fd7-f795-4340-85fa-9f1469bbd1aa","Type":"ContainerStarted","Data":"aa9c174d0ff9f8e619b4b9382be8d48ddda3aaca4a6ca65cdb61cc74963a486c"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.921521 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b9fmt" 
event={"ID":"3a0934ac-db06-4719-9ee0-edbefddcd983","Type":"ContainerStarted","Data":"ab94431e6e856e8762b5638e35264dba0d0cf5be791233e8088e845bcdf0924c"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.921572 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b9fmt" event={"ID":"3a0934ac-db06-4719-9ee0-edbefddcd983","Type":"ContainerStarted","Data":"0ec46b96398c27600fd68c848527f2bb870a290db0b539a045e44b36899b2e6f"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.924827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l5kvn" event={"ID":"3286e3b8-db71-4590-8874-b255bb5600cd","Type":"ContainerDied","Data":"55f365005fc8749ef4656392442751b8e3b126a78d83c6835d6fa4b8ff96a9c0"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.925032 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f365005fc8749ef4656392442751b8e3b126a78d83c6835d6fa4b8ff96a9c0" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.925109 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-l5kvn" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.926815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce95-account-create-update-mv8vt" event={"ID":"7c933d5f-4b3e-43c8-a87d-67b274355687","Type":"ContainerStarted","Data":"bc3597a6896d5097d8928cdc96d9475de068e7fcb43bbbc7e760fe9c2a0d2a35"} Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.935430 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-047d-account-create-update-9296c" podStartSLOduration=1.935407994 podStartE2EDuration="1.935407994s" podCreationTimestamp="2026-01-11 17:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:49:24.932408493 +0000 UTC m=+1139.110601209" watchObservedRunningTime="2026-01-11 17:49:24.935407994 +0000 UTC m=+1139.113600710" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.963212 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-dv96d" podStartSLOduration=1.9631914400000001 podStartE2EDuration="1.96319144s" podCreationTimestamp="2026-01-11 17:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:49:24.95536717 +0000 UTC m=+1139.133559876" watchObservedRunningTime="2026-01-11 17:49:24.96319144 +0000 UTC m=+1139.141384156" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.978519 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ce95-account-create-update-mv8vt" podStartSLOduration=2.978501812 podStartE2EDuration="2.978501812s" podCreationTimestamp="2026-01-11 17:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-11 17:49:24.96990873 +0000 UTC m=+1139.148101446" watchObservedRunningTime="2026-01-11 17:49:24.978501812 +0000 UTC m=+1139.156694528" Jan 11 17:49:24 crc kubenswrapper[4837]: I0111 17:49:24.984632 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-b9fmt" podStartSLOduration=1.984611946 podStartE2EDuration="1.984611946s" podCreationTimestamp="2026-01-11 17:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:49:24.981835701 +0000 UTC m=+1139.160028417" watchObservedRunningTime="2026-01-11 17:49:24.984611946 +0000 UTC m=+1139.162804652" Jan 11 17:49:25 crc kubenswrapper[4837]: I0111 17:49:25.205117 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 11 17:49:25 crc kubenswrapper[4837]: I0111 17:49:25.624959 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:49:25 crc kubenswrapper[4837]: I0111 17:49:25.701813 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-79c66"] Jan 11 17:49:25 crc kubenswrapper[4837]: I0111 17:49:25.702056 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" podUID="e7a1bd32-302a-49e9-a142-942dec63be03" containerName="dnsmasq-dns" containerID="cri-o://1ea4cac02a3e38e4405e89d50c0227c8e1c5d36ab412a14c0dbcb3a3c5750e3f" gracePeriod=10 Jan 11 17:49:25 crc kubenswrapper[4837]: I0111 17:49:25.935768 4837 generic.go:334] "Generic (PLEG): container finished" podID="e7a1bd32-302a-49e9-a142-942dec63be03" containerID="1ea4cac02a3e38e4405e89d50c0227c8e1c5d36ab412a14c0dbcb3a3c5750e3f" exitCode=0 Jan 11 17:49:25 crc kubenswrapper[4837]: I0111 17:49:25.935849 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-79c66" event={"ID":"e7a1bd32-302a-49e9-a142-942dec63be03","Type":"ContainerDied","Data":"1ea4cac02a3e38e4405e89d50c0227c8e1c5d36ab412a14c0dbcb3a3c5750e3f"} Jan 11 17:49:25 crc kubenswrapper[4837]: I0111 17:49:25.956585 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-8247-account-create-update-chtt6" podStartSLOduration=2.956564656 podStartE2EDuration="2.956564656s" podCreationTimestamp="2026-01-11 17:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:49:25.950806331 +0000 UTC m=+1140.128999037" watchObservedRunningTime="2026-01-11 17:49:25.956564656 +0000 UTC m=+1140.134757362" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.308876 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.381831 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-operator-scripts\") pod \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.381974 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz7kc\" (UniqueName: \"kubernetes.io/projected/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-kube-api-access-gz7kc\") pod \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\" (UID: \"65ae93f3-f5ad-4225-86a9-c7bf0748d84f\") " Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.383427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65ae93f3-f5ad-4225-86a9-c7bf0748d84f" (UID: 
"65ae93f3-f5ad-4225-86a9-c7bf0748d84f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.388856 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-kube-api-access-gz7kc" (OuterVolumeSpecName: "kube-api-access-gz7kc") pod "65ae93f3-f5ad-4225-86a9-c7bf0748d84f" (UID: "65ae93f3-f5ad-4225-86a9-c7bf0748d84f"). InnerVolumeSpecName "kube-api-access-gz7kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.484204 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz7kc\" (UniqueName: \"kubernetes.io/projected/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-kube-api-access-gz7kc\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.484450 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65ae93f3-f5ad-4225-86a9-c7bf0748d84f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.589137 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-l5kvn"] Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.596345 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-l5kvn"] Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.950528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" event={"ID":"e7a1bd32-302a-49e9-a142-942dec63be03","Type":"ContainerDied","Data":"d235391e253a6595b7a2e6982e5d878ca0e2a1a8c6e30052aac1c6265a993dbc"} Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.950623 4837 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d235391e253a6595b7a2e6982e5d878ca0e2a1a8c6e30052aac1c6265a993dbc" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.952312 4837 generic.go:334] "Generic (PLEG): container finished" podID="76495fd7-f795-4340-85fa-9f1469bbd1aa" containerID="0a8f3ee29272f42b879ea7bd68b49ee864cdc1330c6740bc020afd8b722065b3" exitCode=0 Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.952395 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dv96d" event={"ID":"76495fd7-f795-4340-85fa-9f1469bbd1aa","Type":"ContainerDied","Data":"0a8f3ee29272f42b879ea7bd68b49ee864cdc1330c6740bc020afd8b722065b3"} Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.957040 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-m8sjp" event={"ID":"65ae93f3-f5ad-4225-86a9-c7bf0748d84f","Type":"ContainerDied","Data":"68a625d5bf9975c052b4456c121cc3f7a6713e85fa06ab3fdef1cae30b3fb6cb"} Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.957065 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a625d5bf9975c052b4456c121cc3f7a6713e85fa06ab3fdef1cae30b3fb6cb" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.957755 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m8sjp" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.962137 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.991823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-nb\") pod \"e7a1bd32-302a-49e9-a142-942dec63be03\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.991921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-dns-svc\") pod \"e7a1bd32-302a-49e9-a142-942dec63be03\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.992012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-config\") pod \"e7a1bd32-302a-49e9-a142-942dec63be03\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.992104 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-sb\") pod \"e7a1bd32-302a-49e9-a142-942dec63be03\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " Jan 11 17:49:26 crc kubenswrapper[4837]: I0111 17:49:26.992151 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl4fn\" (UniqueName: \"kubernetes.io/projected/e7a1bd32-302a-49e9-a142-942dec63be03-kube-api-access-tl4fn\") pod \"e7a1bd32-302a-49e9-a142-942dec63be03\" (UID: \"e7a1bd32-302a-49e9-a142-942dec63be03\") " Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.000445 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e7a1bd32-302a-49e9-a142-942dec63be03-kube-api-access-tl4fn" (OuterVolumeSpecName: "kube-api-access-tl4fn") pod "e7a1bd32-302a-49e9-a142-942dec63be03" (UID: "e7a1bd32-302a-49e9-a142-942dec63be03"). InnerVolumeSpecName "kube-api-access-tl4fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.049090 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-config" (OuterVolumeSpecName: "config") pod "e7a1bd32-302a-49e9-a142-942dec63be03" (UID: "e7a1bd32-302a-49e9-a142-942dec63be03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.050464 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7a1bd32-302a-49e9-a142-942dec63be03" (UID: "e7a1bd32-302a-49e9-a142-942dec63be03"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.075664 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7a1bd32-302a-49e9-a142-942dec63be03" (UID: "e7a1bd32-302a-49e9-a142-942dec63be03"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.085639 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7a1bd32-302a-49e9-a142-942dec63be03" (UID: "e7a1bd32-302a-49e9-a142-942dec63be03"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.092985 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.093005 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.093015 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.093025 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl4fn\" (UniqueName: \"kubernetes.io/projected/e7a1bd32-302a-49e9-a142-942dec63be03-kube-api-access-tl4fn\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.093035 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7a1bd32-302a-49e9-a142-942dec63be03-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.970187 4837 generic.go:334] "Generic (PLEG): container finished" podID="7c933d5f-4b3e-43c8-a87d-67b274355687" containerID="bc3597a6896d5097d8928cdc96d9475de068e7fcb43bbbc7e760fe9c2a0d2a35" exitCode=0 Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.970594 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce95-account-create-update-mv8vt" event={"ID":"7c933d5f-4b3e-43c8-a87d-67b274355687","Type":"ContainerDied","Data":"bc3597a6896d5097d8928cdc96d9475de068e7fcb43bbbc7e760fe9c2a0d2a35"} Jan 11 17:49:27 crc 
kubenswrapper[4837]: I0111 17:49:27.981152 4837 generic.go:334] "Generic (PLEG): container finished" podID="129b7610-1c04-47c2-ba4d-4c20195c2071" containerID="f1a691cbefb207e0912f108bb1859fef3a8ca9a89fb8eee662e8d939b31df464" exitCode=0 Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.981252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8247-account-create-update-chtt6" event={"ID":"129b7610-1c04-47c2-ba4d-4c20195c2071","Type":"ContainerDied","Data":"f1a691cbefb207e0912f108bb1859fef3a8ca9a89fb8eee662e8d939b31df464"} Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.984788 4837 generic.go:334] "Generic (PLEG): container finished" podID="3a0934ac-db06-4719-9ee0-edbefddcd983" containerID="ab94431e6e856e8762b5638e35264dba0d0cf5be791233e8088e845bcdf0924c" exitCode=0 Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.984858 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b9fmt" event={"ID":"3a0934ac-db06-4719-9ee0-edbefddcd983","Type":"ContainerDied","Data":"ab94431e6e856e8762b5638e35264dba0d0cf5be791233e8088e845bcdf0924c"} Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.993520 4837 generic.go:334] "Generic (PLEG): container finished" podID="a5853482-7ef2-4be7-83ac-5212d4db1696" containerID="274cf8df922185f1b531431260cbe8bc072410bcd7e3df68013113f1b78a39c4" exitCode=0 Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.993699 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-79c66" Jan 11 17:49:27 crc kubenswrapper[4837]: I0111 17:49:27.994112 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-047d-account-create-update-9296c" event={"ID":"a5853482-7ef2-4be7-83ac-5212d4db1696","Type":"ContainerDied","Data":"274cf8df922185f1b531431260cbe8bc072410bcd7e3df68013113f1b78a39c4"} Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.096530 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-79c66"] Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.105736 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-79c66"] Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.343917 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dv96d" Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.392325 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3286e3b8-db71-4590-8874-b255bb5600cd" path="/var/lib/kubelet/pods/3286e3b8-db71-4590-8874-b255bb5600cd/volumes" Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.393241 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a1bd32-302a-49e9-a142-942dec63be03" path="/var/lib/kubelet/pods/e7a1bd32-302a-49e9-a142-942dec63be03/volumes" Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.524136 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76495fd7-f795-4340-85fa-9f1469bbd1aa-operator-scripts\") pod \"76495fd7-f795-4340-85fa-9f1469bbd1aa\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.524230 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fpb5\" (UniqueName: 
\"kubernetes.io/projected/76495fd7-f795-4340-85fa-9f1469bbd1aa-kube-api-access-9fpb5\") pod \"76495fd7-f795-4340-85fa-9f1469bbd1aa\" (UID: \"76495fd7-f795-4340-85fa-9f1469bbd1aa\") " Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.525135 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76495fd7-f795-4340-85fa-9f1469bbd1aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76495fd7-f795-4340-85fa-9f1469bbd1aa" (UID: "76495fd7-f795-4340-85fa-9f1469bbd1aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.535270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76495fd7-f795-4340-85fa-9f1469bbd1aa-kube-api-access-9fpb5" (OuterVolumeSpecName: "kube-api-access-9fpb5") pod "76495fd7-f795-4340-85fa-9f1469bbd1aa" (UID: "76495fd7-f795-4340-85fa-9f1469bbd1aa"). InnerVolumeSpecName "kube-api-access-9fpb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.627030 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76495fd7-f795-4340-85fa-9f1469bbd1aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:28 crc kubenswrapper[4837]: I0111 17:49:28.627071 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fpb5\" (UniqueName: \"kubernetes.io/projected/76495fd7-f795-4340-85fa-9f1469bbd1aa-kube-api-access-9fpb5\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.012730 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dv96d" event={"ID":"76495fd7-f795-4340-85fa-9f1469bbd1aa","Type":"ContainerDied","Data":"aa9c174d0ff9f8e619b4b9382be8d48ddda3aaca4a6ca65cdb61cc74963a486c"} Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.012780 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa9c174d0ff9f8e619b4b9382be8d48ddda3aaca4a6ca65cdb61cc74963a486c" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.012828 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dv96d" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.353513 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.544840 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8sr5\" (UniqueName: \"kubernetes.io/projected/3a0934ac-db06-4719-9ee0-edbefddcd983-kube-api-access-k8sr5\") pod \"3a0934ac-db06-4719-9ee0-edbefddcd983\" (UID: \"3a0934ac-db06-4719-9ee0-edbefddcd983\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.545026 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0934ac-db06-4719-9ee0-edbefddcd983-operator-scripts\") pod \"3a0934ac-db06-4719-9ee0-edbefddcd983\" (UID: \"3a0934ac-db06-4719-9ee0-edbefddcd983\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.547566 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a0934ac-db06-4719-9ee0-edbefddcd983-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a0934ac-db06-4719-9ee0-edbefddcd983" (UID: "3a0934ac-db06-4719-9ee0-edbefddcd983"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.551270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a0934ac-db06-4719-9ee0-edbefddcd983-kube-api-access-k8sr5" (OuterVolumeSpecName: "kube-api-access-k8sr5") pod "3a0934ac-db06-4719-9ee0-edbefddcd983" (UID: "3a0934ac-db06-4719-9ee0-edbefddcd983"). InnerVolumeSpecName "kube-api-access-k8sr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.602800 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.614140 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.639298 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.646941 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8sr5\" (UniqueName: \"kubernetes.io/projected/3a0934ac-db06-4719-9ee0-edbefddcd983-kube-api-access-k8sr5\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.646974 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0934ac-db06-4719-9ee0-edbefddcd983-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.748342 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c933d5f-4b3e-43c8-a87d-67b274355687-operator-scripts\") pod \"7c933d5f-4b3e-43c8-a87d-67b274355687\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.748464 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqztt\" (UniqueName: \"kubernetes.io/projected/a5853482-7ef2-4be7-83ac-5212d4db1696-kube-api-access-vqztt\") pod \"a5853482-7ef2-4be7-83ac-5212d4db1696\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.748525 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/129b7610-1c04-47c2-ba4d-4c20195c2071-operator-scripts\") pod \"129b7610-1c04-47c2-ba4d-4c20195c2071\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.748608 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ng5r\" (UniqueName: \"kubernetes.io/projected/7c933d5f-4b3e-43c8-a87d-67b274355687-kube-api-access-5ng5r\") pod \"7c933d5f-4b3e-43c8-a87d-67b274355687\" (UID: \"7c933d5f-4b3e-43c8-a87d-67b274355687\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.748754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5853482-7ef2-4be7-83ac-5212d4db1696-operator-scripts\") pod \"a5853482-7ef2-4be7-83ac-5212d4db1696\" (UID: \"a5853482-7ef2-4be7-83ac-5212d4db1696\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.748954 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xl9q\" (UniqueName: \"kubernetes.io/projected/129b7610-1c04-47c2-ba4d-4c20195c2071-kube-api-access-5xl9q\") pod \"129b7610-1c04-47c2-ba4d-4c20195c2071\" (UID: \"129b7610-1c04-47c2-ba4d-4c20195c2071\") " Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.749339 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129b7610-1c04-47c2-ba4d-4c20195c2071-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "129b7610-1c04-47c2-ba4d-4c20195c2071" (UID: "129b7610-1c04-47c2-ba4d-4c20195c2071"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.749413 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c933d5f-4b3e-43c8-a87d-67b274355687-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c933d5f-4b3e-43c8-a87d-67b274355687" (UID: "7c933d5f-4b3e-43c8-a87d-67b274355687"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.749706 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5853482-7ef2-4be7-83ac-5212d4db1696-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5853482-7ef2-4be7-83ac-5212d4db1696" (UID: "a5853482-7ef2-4be7-83ac-5212d4db1696"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.750160 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5853482-7ef2-4be7-83ac-5212d4db1696-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.750190 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c933d5f-4b3e-43c8-a87d-67b274355687-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.750202 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/129b7610-1c04-47c2-ba4d-4c20195c2071-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.751738 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c933d5f-4b3e-43c8-a87d-67b274355687-kube-api-access-5ng5r" 
(OuterVolumeSpecName: "kube-api-access-5ng5r") pod "7c933d5f-4b3e-43c8-a87d-67b274355687" (UID: "7c933d5f-4b3e-43c8-a87d-67b274355687"). InnerVolumeSpecName "kube-api-access-5ng5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.752411 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5853482-7ef2-4be7-83ac-5212d4db1696-kube-api-access-vqztt" (OuterVolumeSpecName: "kube-api-access-vqztt") pod "a5853482-7ef2-4be7-83ac-5212d4db1696" (UID: "a5853482-7ef2-4be7-83ac-5212d4db1696"). InnerVolumeSpecName "kube-api-access-vqztt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.755767 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129b7610-1c04-47c2-ba4d-4c20195c2071-kube-api-access-5xl9q" (OuterVolumeSpecName: "kube-api-access-5xl9q") pod "129b7610-1c04-47c2-ba4d-4c20195c2071" (UID: "129b7610-1c04-47c2-ba4d-4c20195c2071"). InnerVolumeSpecName "kube-api-access-5xl9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.852408 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqztt\" (UniqueName: \"kubernetes.io/projected/a5853482-7ef2-4be7-83ac-5212d4db1696-kube-api-access-vqztt\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.852444 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ng5r\" (UniqueName: \"kubernetes.io/projected/7c933d5f-4b3e-43c8-a87d-67b274355687-kube-api-access-5ng5r\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:29 crc kubenswrapper[4837]: I0111 17:49:29.852458 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xl9q\" (UniqueName: \"kubernetes.io/projected/129b7610-1c04-47c2-ba4d-4c20195c2071-kube-api-access-5xl9q\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.023331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b9fmt" event={"ID":"3a0934ac-db06-4719-9ee0-edbefddcd983","Type":"ContainerDied","Data":"0ec46b96398c27600fd68c848527f2bb870a290db0b539a045e44b36899b2e6f"} Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.023394 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec46b96398c27600fd68c848527f2bb870a290db0b539a045e44b36899b2e6f" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.023408 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-b9fmt" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.026381 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce95-account-create-update-mv8vt" event={"ID":"7c933d5f-4b3e-43c8-a87d-67b274355687","Type":"ContainerDied","Data":"1dfe1685c7c33c409617b3ea0b6089569997b8bcc6d967c5b713fa8e1fb43d21"} Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.026439 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfe1685c7c33c409617b3ea0b6089569997b8bcc6d967c5b713fa8e1fb43d21" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.026515 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce95-account-create-update-mv8vt" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.031470 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8247-account-create-update-chtt6" event={"ID":"129b7610-1c04-47c2-ba4d-4c20195c2071","Type":"ContainerDied","Data":"31416beb4552974bc3a4b44bc247070e12a5357532fa5c46e4e909975203450b"} Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.031514 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31416beb4552974bc3a4b44bc247070e12a5357532fa5c46e4e909975203450b" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.031569 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8247-account-create-update-chtt6" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.034250 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-047d-account-create-update-9296c" event={"ID":"a5853482-7ef2-4be7-83ac-5212d4db1696","Type":"ContainerDied","Data":"ac6e2ce8399b737ea76ecab1afd967bb501f59a65e4990c8b5e6eb4969631963"} Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.034300 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac6e2ce8399b737ea76ecab1afd967bb501f59a65e4990c8b5e6eb4969631963" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.034324 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-047d-account-create-update-9296c" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.180643 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-865d2"] Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181135 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5853482-7ef2-4be7-83ac-5212d4db1696" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181165 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5853482-7ef2-4be7-83ac-5212d4db1696" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181187 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a1bd32-302a-49e9-a142-942dec63be03" containerName="init" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181197 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a1bd32-302a-49e9-a142-942dec63be03" containerName="init" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181221 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c933d5f-4b3e-43c8-a87d-67b274355687" 
containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181232 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c933d5f-4b3e-43c8-a87d-67b274355687" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181256 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3286e3b8-db71-4590-8874-b255bb5600cd" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181269 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3286e3b8-db71-4590-8874-b255bb5600cd" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181328 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a0934ac-db06-4719-9ee0-edbefddcd983" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181339 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0934ac-db06-4719-9ee0-edbefddcd983" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181354 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a1bd32-302a-49e9-a142-942dec63be03" containerName="dnsmasq-dns" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181363 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a1bd32-302a-49e9-a142-942dec63be03" containerName="dnsmasq-dns" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181379 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76495fd7-f795-4340-85fa-9f1469bbd1aa" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181389 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="76495fd7-f795-4340-85fa-9f1469bbd1aa" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181404 4837 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="65ae93f3-f5ad-4225-86a9-c7bf0748d84f" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181414 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ae93f3-f5ad-4225-86a9-c7bf0748d84f" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: E0111 17:49:30.181427 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129b7610-1c04-47c2-ba4d-4c20195c2071" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181438 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="129b7610-1c04-47c2-ba4d-4c20195c2071" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181665 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ae93f3-f5ad-4225-86a9-c7bf0748d84f" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181706 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c933d5f-4b3e-43c8-a87d-67b274355687" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181725 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3286e3b8-db71-4590-8874-b255bb5600cd" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181744 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5853482-7ef2-4be7-83ac-5212d4db1696" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181764 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a0934ac-db06-4719-9ee0-edbefddcd983" containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181782 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="76495fd7-f795-4340-85fa-9f1469bbd1aa" 
containerName="mariadb-database-create" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181797 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="129b7610-1c04-47c2-ba4d-4c20195c2071" containerName="mariadb-account-create-update" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.181812 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a1bd32-302a-49e9-a142-942dec63be03" containerName="dnsmasq-dns" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.182526 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.185112 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.195165 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-865d2"] Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.265090 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-operator-scripts\") pod \"root-account-create-update-865d2\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.265132 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqw6h\" (UniqueName: \"kubernetes.io/projected/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-kube-api-access-vqw6h\") pod \"root-account-create-update-865d2\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.375979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-operator-scripts\") pod \"root-account-create-update-865d2\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.376098 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqw6h\" (UniqueName: \"kubernetes.io/projected/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-kube-api-access-vqw6h\") pod \"root-account-create-update-865d2\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.382082 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-operator-scripts\") pod \"root-account-create-update-865d2\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.414669 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqw6h\" (UniqueName: \"kubernetes.io/projected/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-kube-api-access-vqw6h\") pod \"root-account-create-update-865d2\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.557290 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-865d2" Jan 11 17:49:30 crc kubenswrapper[4837]: W0111 17:49:30.983123 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8ed2a92_a88f_486e_ac1f_ec1b59216ad6.slice/crio-efc24fd692f0b2beacd08ecd3954eec8f2f8ac4ec362c4440d0aff1094753315 WatchSource:0}: Error finding container efc24fd692f0b2beacd08ecd3954eec8f2f8ac4ec362c4440d0aff1094753315: Status 404 returned error can't find the container with id efc24fd692f0b2beacd08ecd3954eec8f2f8ac4ec362c4440d0aff1094753315 Jan 11 17:49:30 crc kubenswrapper[4837]: I0111 17:49:30.983381 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-865d2"] Jan 11 17:49:31 crc kubenswrapper[4837]: I0111 17:49:31.044383 4837 generic.go:334] "Generic (PLEG): container finished" podID="43068ba1-1d19-4822-88fa-e52f8fb21738" containerID="d3dd869d3fd64b1f5ab874a0f53ad826f8a488a91a162c3b44beaa01e0102440" exitCode=0 Jan 11 17:49:31 crc kubenswrapper[4837]: I0111 17:49:31.044480 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qn5fn" event={"ID":"43068ba1-1d19-4822-88fa-e52f8fb21738","Type":"ContainerDied","Data":"d3dd869d3fd64b1f5ab874a0f53ad826f8a488a91a162c3b44beaa01e0102440"} Jan 11 17:49:31 crc kubenswrapper[4837]: I0111 17:49:31.046794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-865d2" event={"ID":"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6","Type":"ContainerStarted","Data":"efc24fd692f0b2beacd08ecd3954eec8f2f8ac4ec362c4440d0aff1094753315"} Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.059120 4837 generic.go:334] "Generic (PLEG): container finished" podID="d8ed2a92-a88f-486e-ac1f-ec1b59216ad6" containerID="090c16b6579339fa8a190dd68dae626cf24eced302842fd629cf501463501539" exitCode=0 Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.059188 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-865d2" event={"ID":"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6","Type":"ContainerDied","Data":"090c16b6579339fa8a190dd68dae626cf24eced302842fd629cf501463501539"} Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.216554 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.229293 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23a0b787-b5b4-4a4e-828b-d7f34853603f-etc-swift\") pod \"swift-storage-0\" (UID: \"23a0b787-b5b4-4a4e-828b-d7f34853603f\") " pod="openstack/swift-storage-0" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.270086 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.470167 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.521506 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-swiftconf\") pod \"43068ba1-1d19-4822-88fa-e52f8fb21738\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.521757 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-scripts\") pod \"43068ba1-1d19-4822-88fa-e52f8fb21738\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.521840 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/43068ba1-1d19-4822-88fa-e52f8fb21738-etc-swift\") pod \"43068ba1-1d19-4822-88fa-e52f8fb21738\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.521920 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-combined-ca-bundle\") pod \"43068ba1-1d19-4822-88fa-e52f8fb21738\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.521956 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-dispersionconf\") pod \"43068ba1-1d19-4822-88fa-e52f8fb21738\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.521977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnsvp\" (UniqueName: 
\"kubernetes.io/projected/43068ba1-1d19-4822-88fa-e52f8fb21738-kube-api-access-rnsvp\") pod \"43068ba1-1d19-4822-88fa-e52f8fb21738\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.522112 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-ring-data-devices\") pod \"43068ba1-1d19-4822-88fa-e52f8fb21738\" (UID: \"43068ba1-1d19-4822-88fa-e52f8fb21738\") " Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.523176 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "43068ba1-1d19-4822-88fa-e52f8fb21738" (UID: "43068ba1-1d19-4822-88fa-e52f8fb21738"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.524211 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43068ba1-1d19-4822-88fa-e52f8fb21738-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "43068ba1-1d19-4822-88fa-e52f8fb21738" (UID: "43068ba1-1d19-4822-88fa-e52f8fb21738"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.530650 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43068ba1-1d19-4822-88fa-e52f8fb21738-kube-api-access-rnsvp" (OuterVolumeSpecName: "kube-api-access-rnsvp") pod "43068ba1-1d19-4822-88fa-e52f8fb21738" (UID: "43068ba1-1d19-4822-88fa-e52f8fb21738"). InnerVolumeSpecName "kube-api-access-rnsvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.531266 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "43068ba1-1d19-4822-88fa-e52f8fb21738" (UID: "43068ba1-1d19-4822-88fa-e52f8fb21738"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.546095 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43068ba1-1d19-4822-88fa-e52f8fb21738" (UID: "43068ba1-1d19-4822-88fa-e52f8fb21738"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.546910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "43068ba1-1d19-4822-88fa-e52f8fb21738" (UID: "43068ba1-1d19-4822-88fa-e52f8fb21738"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.548480 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-scripts" (OuterVolumeSpecName: "scripts") pod "43068ba1-1d19-4822-88fa-e52f8fb21738" (UID: "43068ba1-1d19-4822-88fa-e52f8fb21738"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.623401 4837 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.623435 4837 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.623444 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43068ba1-1d19-4822-88fa-e52f8fb21738-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.623452 4837 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/43068ba1-1d19-4822-88fa-e52f8fb21738-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.623461 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.623469 4837 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/43068ba1-1d19-4822-88fa-e52f8fb21738-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.623478 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnsvp\" (UniqueName: \"kubernetes.io/projected/43068ba1-1d19-4822-88fa-e52f8fb21738-kube-api-access-rnsvp\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:32 crc kubenswrapper[4837]: I0111 17:49:32.782035 4837 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.070149 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qn5fn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.070196 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qn5fn" event={"ID":"43068ba1-1d19-4822-88fa-e52f8fb21738","Type":"ContainerDied","Data":"55a37974e006e2d77d11c40fb1ec9a7fcd1c38e2df0c9636a9f4e6fe6f9b173b"} Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.070878 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55a37974e006e2d77d11c40fb1ec9a7fcd1c38e2df0c9636a9f4e6fe6f9b173b" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.071823 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"2d86cba7bc3d7426840099db4653866464a172cab509dcf5db155ec0376c4b25"} Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.403212 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-865d2" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.448229 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-operator-scripts\") pod \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.448373 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqw6h\" (UniqueName: \"kubernetes.io/projected/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-kube-api-access-vqw6h\") pod \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\" (UID: \"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6\") " Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.449662 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8ed2a92-a88f-486e-ac1f-ec1b59216ad6" (UID: "d8ed2a92-a88f-486e-ac1f-ec1b59216ad6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.458874 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-kube-api-access-vqw6h" (OuterVolumeSpecName: "kube-api-access-vqw6h") pod "d8ed2a92-a88f-486e-ac1f-ec1b59216ad6" (UID: "d8ed2a92-a88f-486e-ac1f-ec1b59216ad6"). InnerVolumeSpecName "kube-api-access-vqw6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.550980 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.551025 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqw6h\" (UniqueName: \"kubernetes.io/projected/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6-kube-api-access-vqw6h\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.652349 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-r8vzn"] Jan 11 17:49:33 crc kubenswrapper[4837]: E0111 17:49:33.652635 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43068ba1-1d19-4822-88fa-e52f8fb21738" containerName="swift-ring-rebalance" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.652652 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43068ba1-1d19-4822-88fa-e52f8fb21738" containerName="swift-ring-rebalance" Jan 11 17:49:33 crc kubenswrapper[4837]: E0111 17:49:33.652687 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ed2a92-a88f-486e-ac1f-ec1b59216ad6" containerName="mariadb-account-create-update" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.652694 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ed2a92-a88f-486e-ac1f-ec1b59216ad6" containerName="mariadb-account-create-update" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.652842 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ed2a92-a88f-486e-ac1f-ec1b59216ad6" containerName="mariadb-account-create-update" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.652857 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43068ba1-1d19-4822-88fa-e52f8fb21738" 
containerName="swift-ring-rebalance" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.653356 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.654912 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mt2ws" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.655626 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.673865 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-r8vzn"] Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.754658 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-combined-ca-bundle\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.754737 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-db-sync-config-data\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.754862 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6sfd\" (UniqueName: \"kubernetes.io/projected/ed1cfeaf-6da5-40ed-b605-077e5c95900c-kube-api-access-g6sfd\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.754927 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-config-data\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.856344 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6sfd\" (UniqueName: \"kubernetes.io/projected/ed1cfeaf-6da5-40ed-b605-077e5c95900c-kube-api-access-g6sfd\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.856505 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-config-data\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.856591 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-combined-ca-bundle\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.856633 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-db-sync-config-data\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.864274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-config-data\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.864283 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-db-sync-config-data\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.864599 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-combined-ca-bundle\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.879544 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6sfd\" (UniqueName: \"kubernetes.io/projected/ed1cfeaf-6da5-40ed-b605-077e5c95900c-kube-api-access-g6sfd\") pod \"glance-db-sync-r8vzn\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:33 crc kubenswrapper[4837]: I0111 17:49:33.968767 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-r8vzn" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.108214 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-865d2" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.109209 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-865d2" event={"ID":"d8ed2a92-a88f-486e-ac1f-ec1b59216ad6","Type":"ContainerDied","Data":"efc24fd692f0b2beacd08ecd3954eec8f2f8ac4ec362c4440d0aff1094753315"} Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.109261 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc24fd692f0b2beacd08ecd3954eec8f2f8ac4ec362c4440d0aff1094753315" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.269366 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zfjdc" podUID="91f28f51-1965-4fdd-bcb8-c261644249d5" containerName="ovn-controller" probeResult="failure" output=< Jan 11 17:49:34 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 11 17:49:34 crc kubenswrapper[4837]: > Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.315485 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-22bpd" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.318663 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-22bpd" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.592401 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zfjdc-config-cwprt"] Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.593304 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.594958 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.600383 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc-config-cwprt"] Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.636050 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-r8vzn"] Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.669398 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-log-ovn\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.669467 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run-ovn\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.669492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-kube-api-access-qdbrz\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.669520 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-scripts\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.669745 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-additional-scripts\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.669872 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run-ovn\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-kube-api-access-qdbrz\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771460 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-scripts\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771553 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-additional-scripts\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771584 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771616 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-log-ovn\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771815 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771864 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run-ovn\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.771915 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-log-ovn\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.772445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-additional-scripts\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.774366 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-scripts\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.796282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-kube-api-access-qdbrz\") pod \"ovn-controller-zfjdc-config-cwprt\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:34 crc kubenswrapper[4837]: I0111 17:49:34.909825 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:35 crc kubenswrapper[4837]: I0111 17:49:35.115569 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r8vzn" event={"ID":"ed1cfeaf-6da5-40ed-b605-077e5c95900c","Type":"ContainerStarted","Data":"a0b6ab519ec350b87b042e380a38c6f87de122c9827e623f60a8415eb6f1f8f4"} Jan 11 17:49:35 crc kubenswrapper[4837]: I0111 17:49:35.422045 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc-config-cwprt"] Jan 11 17:49:36 crc kubenswrapper[4837]: I0111 17:49:36.585473 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-865d2"] Jan 11 17:49:36 crc kubenswrapper[4837]: I0111 17:49:36.590978 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-865d2"] Jan 11 17:49:36 crc kubenswrapper[4837]: W0111 17:49:36.787084 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb55ccd7a_b1ad_431a_a35c_4a9132cec68b.slice/crio-914ead4afcccda62e963ed4ad8a8b7cdbcf35f7e3186e7c91bb4a8d1af562c1b WatchSource:0}: Error finding container 914ead4afcccda62e963ed4ad8a8b7cdbcf35f7e3186e7c91bb4a8d1af562c1b: Status 404 returned error can't find the container with id 914ead4afcccda62e963ed4ad8a8b7cdbcf35f7e3186e7c91bb4a8d1af562c1b Jan 11 17:49:37 crc kubenswrapper[4837]: I0111 17:49:37.140548 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"dda207df1543c20657481d0ed4d8c0179225121c0eaf5266e212085517740e6f"} Jan 11 17:49:37 crc kubenswrapper[4837]: I0111 17:49:37.144171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-cwprt" 
event={"ID":"b55ccd7a-b1ad-431a-a35c-4a9132cec68b","Type":"ContainerStarted","Data":"53fb0ce9a8cb569af17e8a40682754a192e1d5c1f46b4b28c0b252a77c3822f0"} Jan 11 17:49:37 crc kubenswrapper[4837]: I0111 17:49:37.144263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-cwprt" event={"ID":"b55ccd7a-b1ad-431a-a35c-4a9132cec68b","Type":"ContainerStarted","Data":"914ead4afcccda62e963ed4ad8a8b7cdbcf35f7e3186e7c91bb4a8d1af562c1b"} Jan 11 17:49:37 crc kubenswrapper[4837]: I0111 17:49:37.182767 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zfjdc-config-cwprt" podStartSLOduration=3.182746871 podStartE2EDuration="3.182746871s" podCreationTimestamp="2026-01-11 17:49:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:49:37.182071103 +0000 UTC m=+1151.360263829" watchObservedRunningTime="2026-01-11 17:49:37.182746871 +0000 UTC m=+1151.360939587" Jan 11 17:49:38 crc kubenswrapper[4837]: I0111 17:49:38.154453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"94de47283f609b0c54161f9f264ecd0972317e2a9763b93e92c564526992d35a"} Jan 11 17:49:38 crc kubenswrapper[4837]: I0111 17:49:38.154838 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"46e6abc0a0d310e057b0c87cf5697a39cc93cde02ae0255dcd0befb6a0006970"} Jan 11 17:49:38 crc kubenswrapper[4837]: I0111 17:49:38.154853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"928aa3a00558461d0b445a59fdfa611db9eb41e6bb921557e1b1a33f7e6ab196"} Jan 11 17:49:38 crc kubenswrapper[4837]: I0111 
17:49:38.156167 4837 generic.go:334] "Generic (PLEG): container finished" podID="b55ccd7a-b1ad-431a-a35c-4a9132cec68b" containerID="53fb0ce9a8cb569af17e8a40682754a192e1d5c1f46b4b28c0b252a77c3822f0" exitCode=0 Jan 11 17:49:38 crc kubenswrapper[4837]: I0111 17:49:38.156212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-cwprt" event={"ID":"b55ccd7a-b1ad-431a-a35c-4a9132cec68b","Type":"ContainerDied","Data":"53fb0ce9a8cb569af17e8a40682754a192e1d5c1f46b4b28c0b252a77c3822f0"} Jan 11 17:49:38 crc kubenswrapper[4837]: I0111 17:49:38.379428 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ed2a92-a88f-486e-ac1f-ec1b59216ad6" path="/var/lib/kubelet/pods/d8ed2a92-a88f-486e-ac1f-ec1b59216ad6/volumes" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.308800 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zfjdc" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.562199 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.659997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-kube-api-access-qdbrz\") pod \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.660351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run-ovn\") pod \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.660407 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-additional-scripts\") pod \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.660447 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run\") pod \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.660714 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-scripts\") pod \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.660768 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-log-ovn\") pod \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\" (UID: \"b55ccd7a-b1ad-431a-a35c-4a9132cec68b\") " Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.661453 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b55ccd7a-b1ad-431a-a35c-4a9132cec68b" (UID: "b55ccd7a-b1ad-431a-a35c-4a9132cec68b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.665405 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b55ccd7a-b1ad-431a-a35c-4a9132cec68b" (UID: "b55ccd7a-b1ad-431a-a35c-4a9132cec68b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.665443 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b55ccd7a-b1ad-431a-a35c-4a9132cec68b" (UID: "b55ccd7a-b1ad-431a-a35c-4a9132cec68b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.665606 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run" (OuterVolumeSpecName: "var-run") pod "b55ccd7a-b1ad-431a-a35c-4a9132cec68b" (UID: "b55ccd7a-b1ad-431a-a35c-4a9132cec68b"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.666303 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-scripts" (OuterVolumeSpecName: "scripts") pod "b55ccd7a-b1ad-431a-a35c-4a9132cec68b" (UID: "b55ccd7a-b1ad-431a-a35c-4a9132cec68b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.672976 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-kube-api-access-qdbrz" (OuterVolumeSpecName: "kube-api-access-qdbrz") pod "b55ccd7a-b1ad-431a-a35c-4a9132cec68b" (UID: "b55ccd7a-b1ad-431a-a35c-4a9132cec68b"). InnerVolumeSpecName "kube-api-access-qdbrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.763147 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdbrz\" (UniqueName: \"kubernetes.io/projected/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-kube-api-access-qdbrz\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.763182 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.763194 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.763204 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-run\") on node \"crc\" 
DevicePath \"\"" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.763245 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:39 crc kubenswrapper[4837]: I0111 17:49:39.763255 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b55ccd7a-b1ad-431a-a35c-4a9132cec68b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.179103 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"d32335d568d9e84364adbdf3cd531390179440eab8347bcaee6941595ed364e5"} Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.180900 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-cwprt" event={"ID":"b55ccd7a-b1ad-431a-a35c-4a9132cec68b","Type":"ContainerDied","Data":"914ead4afcccda62e963ed4ad8a8b7cdbcf35f7e3186e7c91bb4a8d1af562c1b"} Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.180929 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="914ead4afcccda62e963ed4ad8a8b7cdbcf35f7e3186e7c91bb4a8d1af562c1b" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.180994 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-cwprt" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.680014 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zfjdc-config-cwprt"] Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.686816 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zfjdc-config-cwprt"] Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.746747 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zfjdc-config-ctrsb"] Jan 11 17:49:40 crc kubenswrapper[4837]: E0111 17:49:40.747041 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55ccd7a-b1ad-431a-a35c-4a9132cec68b" containerName="ovn-config" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.747055 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55ccd7a-b1ad-431a-a35c-4a9132cec68b" containerName="ovn-config" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.747208 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55ccd7a-b1ad-431a-a35c-4a9132cec68b" containerName="ovn-config" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.747695 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.752262 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.758695 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc-config-ctrsb"] Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.807052 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjvk\" (UniqueName: \"kubernetes.io/projected/a72aad60-a71e-46a9-a37d-e9f9796d4042-kube-api-access-6tjvk\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.807119 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run-ovn\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.807138 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-log-ovn\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.807208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-additional-scripts\") pod 
\"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.807277 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-scripts\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.807355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909563 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjvk\" (UniqueName: \"kubernetes.io/projected/a72aad60-a71e-46a9-a37d-e9f9796d4042-kube-api-access-6tjvk\") pod 
\"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run-ovn\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909614 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-log-ovn\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909684 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run-ovn\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909784 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-log-ovn\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909816 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-additional-scripts\") pod 
\"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.909864 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-scripts\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.910620 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-additional-scripts\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.913092 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-scripts\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:40 crc kubenswrapper[4837]: I0111 17:49:40.925549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjvk\" (UniqueName: \"kubernetes.io/projected/a72aad60-a71e-46a9-a37d-e9f9796d4042-kube-api-access-6tjvk\") pod \"ovn-controller-zfjdc-config-ctrsb\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.118272 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.202643 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"e8c5311db3da52cbd3aeb52121436f98a17fd8157972c25828eb4192e3cc4a14"} Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.202694 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"37e52f624365879bedd9e730685b30eb636f5bd83de56a3f0f7aa66e4096f802"} Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.202707 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"b01eb620927c8b664d8156be3835aee150a06d22df3c119f5a28110d7b655540"} Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.600785 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-khqcj"] Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.602022 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.604021 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.619233 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc-config-ctrsb"] Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.628408 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-khqcj"] Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.720793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhch\" (UniqueName: \"kubernetes.io/projected/d861b390-4351-4ffb-8bb5-19201f06e3db-kube-api-access-jzhch\") pod \"root-account-create-update-khqcj\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.720946 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d861b390-4351-4ffb-8bb5-19201f06e3db-operator-scripts\") pod \"root-account-create-update-khqcj\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.822334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d861b390-4351-4ffb-8bb5-19201f06e3db-operator-scripts\") pod \"root-account-create-update-khqcj\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.822425 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jzhch\" (UniqueName: \"kubernetes.io/projected/d861b390-4351-4ffb-8bb5-19201f06e3db-kube-api-access-jzhch\") pod \"root-account-create-update-khqcj\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.823415 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d861b390-4351-4ffb-8bb5-19201f06e3db-operator-scripts\") pod \"root-account-create-update-khqcj\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.842655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhch\" (UniqueName: \"kubernetes.io/projected/d861b390-4351-4ffb-8bb5-19201f06e3db-kube-api-access-jzhch\") pod \"root-account-create-update-khqcj\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:41 crc kubenswrapper[4837]: I0111 17:49:41.924187 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:42 crc kubenswrapper[4837]: I0111 17:49:42.213550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-ctrsb" event={"ID":"a72aad60-a71e-46a9-a37d-e9f9796d4042","Type":"ContainerStarted","Data":"feeca970611022b253501bdfadb41b2d85806a12f4c47717e10ddc070c8ba25e"} Jan 11 17:49:42 crc kubenswrapper[4837]: I0111 17:49:42.375630 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b55ccd7a-b1ad-431a-a35c-4a9132cec68b" path="/var/lib/kubelet/pods/b55ccd7a-b1ad-431a-a35c-4a9132cec68b/volumes" Jan 11 17:49:44 crc kubenswrapper[4837]: I0111 17:49:44.236755 4837 generic.go:334] "Generic (PLEG): container finished" podID="f21b505a-45c3-4f7e-b323-204d384185b9" containerID="c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc" exitCode=0 Jan 11 17:49:44 crc kubenswrapper[4837]: I0111 17:49:44.236815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f21b505a-45c3-4f7e-b323-204d384185b9","Type":"ContainerDied","Data":"c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc"} Jan 11 17:49:44 crc kubenswrapper[4837]: I0111 17:49:44.241638 4837 generic.go:334] "Generic (PLEG): container finished" podID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerID="8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051" exitCode=0 Jan 11 17:49:44 crc kubenswrapper[4837]: I0111 17:49:44.241718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89ca88bf-2462-4fce-8a85-8dc04655b21c","Type":"ContainerDied","Data":"8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051"} Jan 11 17:49:53 crc kubenswrapper[4837]: I0111 17:49:53.446593 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-khqcj"] Jan 11 17:49:53 crc kubenswrapper[4837]: W0111 17:49:53.621167 4837 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd861b390_4351_4ffb_8bb5_19201f06e3db.slice/crio-316b39e5aa11cc6d5d7e73193c960db54e6a8f9b03873b0709acd8b3a0e0642a WatchSource:0}: Error finding container 316b39e5aa11cc6d5d7e73193c960db54e6a8f9b03873b0709acd8b3a0e0642a: Status 404 returned error can't find the container with id 316b39e5aa11cc6d5d7e73193c960db54e6a8f9b03873b0709acd8b3a0e0642a Jan 11 17:49:54 crc kubenswrapper[4837]: I0111 17:49:54.332071 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khqcj" event={"ID":"d861b390-4351-4ffb-8bb5-19201f06e3db","Type":"ContainerStarted","Data":"316b39e5aa11cc6d5d7e73193c960db54e6a8f9b03873b0709acd8b3a0e0642a"} Jan 11 17:49:55 crc kubenswrapper[4837]: I0111 17:49:55.342778 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-ctrsb" event={"ID":"a72aad60-a71e-46a9-a37d-e9f9796d4042","Type":"ContainerStarted","Data":"f5e5a6d7b3e8107fad32fcdd4674d216875cbcfa938175bd760e22e94ee75e3e"} Jan 11 17:49:55 crc kubenswrapper[4837]: I0111 17:49:55.349160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"e8634135be6ee68fb97b9d092235a41defde966c7e4e22481adb2288bf44ea0a"} Jan 11 17:49:56 crc kubenswrapper[4837]: I0111 17:49:56.357868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89ca88bf-2462-4fce-8a85-8dc04655b21c","Type":"ContainerStarted","Data":"5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879"} Jan 11 17:49:56 crc kubenswrapper[4837]: E0111 17:49:56.410938 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Jan 11 
17:49:56 crc kubenswrapper[4837]: E0111 17:49:56.411111 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6sfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePo
licy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-r8vzn_openstack(ed1cfeaf-6da5-40ed-b605-077e5c95900c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:49:56 crc kubenswrapper[4837]: E0111 17:49:56.412563 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-r8vzn" podUID="ed1cfeaf-6da5-40ed-b605-077e5c95900c" Jan 11 17:49:57 crc kubenswrapper[4837]: I0111 17:49:57.370341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khqcj" event={"ID":"d861b390-4351-4ffb-8bb5-19201f06e3db","Type":"ContainerStarted","Data":"56b953bf2da6d3a8a84c18f4249c0a5f2c9bf0cdb0724a20a9cff19e12b1f277"} Jan 11 17:49:57 crc kubenswrapper[4837]: I0111 17:49:57.374036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f21b505a-45c3-4f7e-b323-204d384185b9","Type":"ContainerStarted","Data":"96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164"} Jan 11 17:49:57 crc kubenswrapper[4837]: E0111 17:49:57.376150 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-r8vzn" podUID="ed1cfeaf-6da5-40ed-b605-077e5c95900c" Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.391552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"06f434113f210278f34d36c40688248677a17e99005c003374a9515e79b5e5cd"} Jan 11 17:49:58 
crc kubenswrapper[4837]: I0111 17:49:58.396292 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khqcj" event={"ID":"d861b390-4351-4ffb-8bb5-19201f06e3db","Type":"ContainerDied","Data":"56b953bf2da6d3a8a84c18f4249c0a5f2c9bf0cdb0724a20a9cff19e12b1f277"} Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.396232 4837 generic.go:334] "Generic (PLEG): container finished" podID="d861b390-4351-4ffb-8bb5-19201f06e3db" containerID="56b953bf2da6d3a8a84c18f4249c0a5f2c9bf0cdb0724a20a9cff19e12b1f277" exitCode=0 Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.398403 4837 generic.go:334] "Generic (PLEG): container finished" podID="a72aad60-a71e-46a9-a37d-e9f9796d4042" containerID="f5e5a6d7b3e8107fad32fcdd4674d216875cbcfa938175bd760e22e94ee75e3e" exitCode=0 Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.398545 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-ctrsb" event={"ID":"a72aad60-a71e-46a9-a37d-e9f9796d4042","Type":"ContainerDied","Data":"f5e5a6d7b3e8107fad32fcdd4674d216875cbcfa938175bd760e22e94ee75e3e"} Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.399105 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.399474 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.442834 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.034110253 podStartE2EDuration="1m40.442818671s" podCreationTimestamp="2026-01-11 17:48:18 +0000 UTC" firstStartedPulling="2026-01-11 17:48:23.598597637 +0000 UTC m=+1077.776790343" lastFinishedPulling="2026-01-11 17:49:08.007306055 +0000 UTC m=+1122.185498761" observedRunningTime="2026-01-11 17:49:58.438004032 +0000 UTC 
m=+1172.616196728" watchObservedRunningTime="2026-01-11 17:49:58.442818671 +0000 UTC m=+1172.621011377" Jan 11 17:49:58 crc kubenswrapper[4837]: I0111 17:49:58.473581 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.99061483 podStartE2EDuration="1m40.473488305s" podCreationTimestamp="2026-01-11 17:48:18 +0000 UTC" firstStartedPulling="2026-01-11 17:48:23.46151581 +0000 UTC m=+1077.639708516" lastFinishedPulling="2026-01-11 17:49:07.944389295 +0000 UTC m=+1122.122581991" observedRunningTime="2026-01-11 17:49:58.466147698 +0000 UTC m=+1172.644340404" watchObservedRunningTime="2026-01-11 17:49:58.473488305 +0000 UTC m=+1172.651681011" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.412079 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"80efce2e9d2e51f81719e586e66bd80700e5e24ce6eb4af5ebcaf50cc1a7efa9"} Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.412123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"45ed2214ec26521f5711e93be9703c4ea7a7d749c39284ce95fd43a7958f1099"} Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.757582 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-khqcj" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.807156 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.859949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzhch\" (UniqueName: \"kubernetes.io/projected/d861b390-4351-4ffb-8bb5-19201f06e3db-kube-api-access-jzhch\") pod \"d861b390-4351-4ffb-8bb5-19201f06e3db\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.860045 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d861b390-4351-4ffb-8bb5-19201f06e3db-operator-scripts\") pod \"d861b390-4351-4ffb-8bb5-19201f06e3db\" (UID: \"d861b390-4351-4ffb-8bb5-19201f06e3db\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.860866 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d861b390-4351-4ffb-8bb5-19201f06e3db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d861b390-4351-4ffb-8bb5-19201f06e3db" (UID: "d861b390-4351-4ffb-8bb5-19201f06e3db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.865320 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d861b390-4351-4ffb-8bb5-19201f06e3db-kube-api-access-jzhch" (OuterVolumeSpecName: "kube-api-access-jzhch") pod "d861b390-4351-4ffb-8bb5-19201f06e3db" (UID: "d861b390-4351-4ffb-8bb5-19201f06e3db"). InnerVolumeSpecName "kube-api-access-jzhch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966205 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run\") pod \"a72aad60-a71e-46a9-a37d-e9f9796d4042\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966266 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run-ovn\") pod \"a72aad60-a71e-46a9-a37d-e9f9796d4042\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966334 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run" (OuterVolumeSpecName: "var-run") pod "a72aad60-a71e-46a9-a37d-e9f9796d4042" (UID: "a72aad60-a71e-46a9-a37d-e9f9796d4042"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966428 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-additional-scripts\") pod \"a72aad60-a71e-46a9-a37d-e9f9796d4042\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-log-ovn\") pod \"a72aad60-a71e-46a9-a37d-e9f9796d4042\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966439 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a72aad60-a71e-46a9-a37d-e9f9796d4042" (UID: "a72aad60-a71e-46a9-a37d-e9f9796d4042"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-scripts\") pod \"a72aad60-a71e-46a9-a37d-e9f9796d4042\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966514 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tjvk\" (UniqueName: \"kubernetes.io/projected/a72aad60-a71e-46a9-a37d-e9f9796d4042-kube-api-access-6tjvk\") pod \"a72aad60-a71e-46a9-a37d-e9f9796d4042\" (UID: \"a72aad60-a71e-46a9-a37d-e9f9796d4042\") " Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a72aad60-a71e-46a9-a37d-e9f9796d4042" (UID: "a72aad60-a71e-46a9-a37d-e9f9796d4042"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966828 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzhch\" (UniqueName: \"kubernetes.io/projected/d861b390-4351-4ffb-8bb5-19201f06e3db-kube-api-access-jzhch\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966842 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966852 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d861b390-4351-4ffb-8bb5-19201f06e3db-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966863 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.966871 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a72aad60-a71e-46a9-a37d-e9f9796d4042-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.967070 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a72aad60-a71e-46a9-a37d-e9f9796d4042" (UID: "a72aad60-a71e-46a9-a37d-e9f9796d4042"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.967493 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-scripts" (OuterVolumeSpecName: "scripts") pod "a72aad60-a71e-46a9-a37d-e9f9796d4042" (UID: "a72aad60-a71e-46a9-a37d-e9f9796d4042"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:49:59 crc kubenswrapper[4837]: I0111 17:49:59.987394 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72aad60-a71e-46a9-a37d-e9f9796d4042-kube-api-access-6tjvk" (OuterVolumeSpecName: "kube-api-access-6tjvk") pod "a72aad60-a71e-46a9-a37d-e9f9796d4042" (UID: "a72aad60-a71e-46a9-a37d-e9f9796d4042"). InnerVolumeSpecName "kube-api-access-6tjvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.068748 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.069095 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a72aad60-a71e-46a9-a37d-e9f9796d4042-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.069112 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tjvk\" (UniqueName: \"kubernetes.io/projected/a72aad60-a71e-46a9-a37d-e9f9796d4042-kube-api-access-6tjvk\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.424281 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-khqcj" 
event={"ID":"d861b390-4351-4ffb-8bb5-19201f06e3db","Type":"ContainerDied","Data":"316b39e5aa11cc6d5d7e73193c960db54e6a8f9b03873b0709acd8b3a0e0642a"} Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.424325 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="316b39e5aa11cc6d5d7e73193c960db54e6a8f9b03873b0709acd8b3a0e0642a" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.424361 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-khqcj" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.427500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-ctrsb" event={"ID":"a72aad60-a71e-46a9-a37d-e9f9796d4042","Type":"ContainerDied","Data":"feeca970611022b253501bdfadb41b2d85806a12f4c47717e10ddc070c8ba25e"} Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.427541 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feeca970611022b253501bdfadb41b2d85806a12f4c47717e10ddc070c8ba25e" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.427599 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-ctrsb" Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.436297 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"3ba3e992da40e819b24cc1622f1c8411aef297ac86f111c2e9086e5d468deeab"} Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.436340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"34e17b427413f60d87afab1287806a3bbe2e87694bea57e2c1bc09970a4c0cae"} Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.900191 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zfjdc-config-ctrsb"] Jan 11 17:50:00 crc kubenswrapper[4837]: I0111 17:50:00.905827 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zfjdc-config-ctrsb"] Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.037729 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zfjdc-config-pk98z"] Jan 11 17:50:01 crc kubenswrapper[4837]: E0111 17:50:01.038064 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d861b390-4351-4ffb-8bb5-19201f06e3db" containerName="mariadb-account-create-update" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.038076 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d861b390-4351-4ffb-8bb5-19201f06e3db" containerName="mariadb-account-create-update" Jan 11 17:50:01 crc kubenswrapper[4837]: E0111 17:50:01.038088 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72aad60-a71e-46a9-a37d-e9f9796d4042" containerName="ovn-config" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.038094 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72aad60-a71e-46a9-a37d-e9f9796d4042" containerName="ovn-config" Jan 
11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.038244 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d861b390-4351-4ffb-8bb5-19201f06e3db" containerName="mariadb-account-create-update" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.038260 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72aad60-a71e-46a9-a37d-e9f9796d4042" containerName="ovn-config" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.038853 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.041765 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.058450 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc-config-pk98z"] Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.185857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f84lw\" (UniqueName: \"kubernetes.io/projected/803e6524-2b98-4987-852a-5e70cc5f99c9-kube-api-access-f84lw\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.185970 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.186046 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-additional-scripts\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.186095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run-ovn\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.186160 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-log-ovn\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.186340 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-scripts\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.287575 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-additional-scripts\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.287625 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run-ovn\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.287997 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run-ovn\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.288058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-log-ovn\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.288142 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-log-ovn\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.288235 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-scripts\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.288341 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-additional-scripts\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.290121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-scripts\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.290182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f84lw\" (UniqueName: \"kubernetes.io/projected/803e6524-2b98-4987-852a-5e70cc5f99c9-kube-api-access-f84lw\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.290374 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.290506 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.312208 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f84lw\" (UniqueName: 
\"kubernetes.io/projected/803e6524-2b98-4987-852a-5e70cc5f99c9-kube-api-access-f84lw\") pod \"ovn-controller-zfjdc-config-pk98z\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.357119 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.540810 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"23a0b787-b5b4-4a4e-828b-d7f34853603f","Type":"ContainerStarted","Data":"f94ac4194137630893b6187b768a3802ad6c5dd15db382db63398344d53c9692"} Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.858169 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=26.702035844 podStartE2EDuration="46.858153779s" podCreationTimestamp="2026-01-11 17:49:15 +0000 UTC" firstStartedPulling="2026-01-11 17:49:32.784239601 +0000 UTC m=+1146.962432307" lastFinishedPulling="2026-01-11 17:49:52.940357526 +0000 UTC m=+1167.118550242" observedRunningTime="2026-01-11 17:50:01.590406957 +0000 UTC m=+1175.768599663" watchObservedRunningTime="2026-01-11 17:50:01.858153779 +0000 UTC m=+1176.036346475" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.861925 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-xqvrn"] Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.863211 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.865725 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.878603 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-xqvrn"] Jan 11 17:50:01 crc kubenswrapper[4837]: I0111 17:50:01.911185 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zfjdc-config-pk98z"] Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.049996 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vlnn\" (UniqueName: \"kubernetes.io/projected/e5f0d593-ab67-4967-9397-517c45742d39-kube-api-access-7vlnn\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.050260 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.050285 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-config\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.050346 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.050407 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.050437 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.151354 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.151404 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.151463 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vlnn\" (UniqueName: 
\"kubernetes.io/projected/e5f0d593-ab67-4967-9397-517c45742d39-kube-api-access-7vlnn\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.151501 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.151522 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-config\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.151560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.152282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.152881 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-sb\") pod 
\"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.152973 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-config\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.153071 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.153099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.174649 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vlnn\" (UniqueName: \"kubernetes.io/projected/e5f0d593-ab67-4967-9397-517c45742d39-kube-api-access-7vlnn\") pod \"dnsmasq-dns-77585f5f8c-xqvrn\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.185647 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.379850 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72aad60-a71e-46a9-a37d-e9f9796d4042" path="/var/lib/kubelet/pods/a72aad60-a71e-46a9-a37d-e9f9796d4042/volumes" Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.548479 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-pk98z" event={"ID":"803e6524-2b98-4987-852a-5e70cc5f99c9","Type":"ContainerStarted","Data":"a8301c4c919537beac441d4504f14706e333b943d7a0190414f107378ba46847"} Jan 11 17:50:02 crc kubenswrapper[4837]: W0111 17:50:02.688619 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5f0d593_ab67_4967_9397_517c45742d39.slice/crio-903a073c30fe7183293db7550b9f7189457f4885c8da44caa5fae22c605188d5 WatchSource:0}: Error finding container 903a073c30fe7183293db7550b9f7189457f4885c8da44caa5fae22c605188d5: Status 404 returned error can't find the container with id 903a073c30fe7183293db7550b9f7189457f4885c8da44caa5fae22c605188d5 Jan 11 17:50:02 crc kubenswrapper[4837]: I0111 17:50:02.689739 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-xqvrn"] Jan 11 17:50:03 crc kubenswrapper[4837]: I0111 17:50:03.558095 4837 generic.go:334] "Generic (PLEG): container finished" podID="e5f0d593-ab67-4967-9397-517c45742d39" containerID="d241bf124a3b2ee1f8906722e2dac544932bb327fcf087bf4e834c39c85bc804" exitCode=0 Jan 11 17:50:03 crc kubenswrapper[4837]: I0111 17:50:03.558476 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" event={"ID":"e5f0d593-ab67-4967-9397-517c45742d39","Type":"ContainerDied","Data":"d241bf124a3b2ee1f8906722e2dac544932bb327fcf087bf4e834c39c85bc804"} Jan 11 17:50:03 crc kubenswrapper[4837]: I0111 17:50:03.558504 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" event={"ID":"e5f0d593-ab67-4967-9397-517c45742d39","Type":"ContainerStarted","Data":"903a073c30fe7183293db7550b9f7189457f4885c8da44caa5fae22c605188d5"} Jan 11 17:50:03 crc kubenswrapper[4837]: I0111 17:50:03.560601 4837 generic.go:334] "Generic (PLEG): container finished" podID="803e6524-2b98-4987-852a-5e70cc5f99c9" containerID="221ed1d81791c870665aa671ba57922eeda80d386e82af5182a48ba03b593cf1" exitCode=0 Jan 11 17:50:03 crc kubenswrapper[4837]: I0111 17:50:03.560628 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-pk98z" event={"ID":"803e6524-2b98-4987-852a-5e70cc5f99c9","Type":"ContainerDied","Data":"221ed1d81791c870665aa671ba57922eeda80d386e82af5182a48ba03b593cf1"} Jan 11 17:50:04 crc kubenswrapper[4837]: I0111 17:50:04.574913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" event={"ID":"e5f0d593-ab67-4967-9397-517c45742d39","Type":"ContainerStarted","Data":"77eee50bec11a2ded47dc627f6957ec511c0d29c38f2d45ecf1bb4c34fd9dfab"} Jan 11 17:50:04 crc kubenswrapper[4837]: I0111 17:50:04.605104 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" podStartSLOduration=3.605079022 podStartE2EDuration="3.605079022s" podCreationTimestamp="2026-01-11 17:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:50:04.599500462 +0000 UTC m=+1178.777693198" watchObservedRunningTime="2026-01-11 17:50:04.605079022 +0000 UTC m=+1178.783271768" Jan 11 17:50:04 crc kubenswrapper[4837]: I0111 17:50:04.988856 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106192 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-log-ovn\") pod \"803e6524-2b98-4987-852a-5e70cc5f99c9\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "803e6524-2b98-4987-852a-5e70cc5f99c9" (UID: "803e6524-2b98-4987-852a-5e70cc5f99c9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106511 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-additional-scripts\") pod \"803e6524-2b98-4987-852a-5e70cc5f99c9\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106579 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-scripts\") pod \"803e6524-2b98-4987-852a-5e70cc5f99c9\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106619 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run-ovn\") pod \"803e6524-2b98-4987-852a-5e70cc5f99c9\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106719 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run\") pod \"803e6524-2b98-4987-852a-5e70cc5f99c9\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106762 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f84lw\" (UniqueName: \"kubernetes.io/projected/803e6524-2b98-4987-852a-5e70cc5f99c9-kube-api-access-f84lw\") pod \"803e6524-2b98-4987-852a-5e70cc5f99c9\" (UID: \"803e6524-2b98-4987-852a-5e70cc5f99c9\") " Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106837 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "803e6524-2b98-4987-852a-5e70cc5f99c9" (UID: "803e6524-2b98-4987-852a-5e70cc5f99c9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.106932 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run" (OuterVolumeSpecName: "var-run") pod "803e6524-2b98-4987-852a-5e70cc5f99c9" (UID: "803e6524-2b98-4987-852a-5e70cc5f99c9"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.107180 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.107251 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-run\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.107314 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/803e6524-2b98-4987-852a-5e70cc5f99c9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.107734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "803e6524-2b98-4987-852a-5e70cc5f99c9" (UID: "803e6524-2b98-4987-852a-5e70cc5f99c9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.107886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-scripts" (OuterVolumeSpecName: "scripts") pod "803e6524-2b98-4987-852a-5e70cc5f99c9" (UID: "803e6524-2b98-4987-852a-5e70cc5f99c9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.115463 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803e6524-2b98-4987-852a-5e70cc5f99c9-kube-api-access-f84lw" (OuterVolumeSpecName: "kube-api-access-f84lw") pod "803e6524-2b98-4987-852a-5e70cc5f99c9" (UID: "803e6524-2b98-4987-852a-5e70cc5f99c9"). InnerVolumeSpecName "kube-api-access-f84lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.208536 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.208581 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/803e6524-2b98-4987-852a-5e70cc5f99c9-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.208597 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f84lw\" (UniqueName: \"kubernetes.io/projected/803e6524-2b98-4987-852a-5e70cc5f99c9-kube-api-access-f84lw\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.588138 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zfjdc-config-pk98z" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.590871 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zfjdc-config-pk98z" event={"ID":"803e6524-2b98-4987-852a-5e70cc5f99c9","Type":"ContainerDied","Data":"a8301c4c919537beac441d4504f14706e333b943d7a0190414f107378ba46847"} Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.590943 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8301c4c919537beac441d4504f14706e333b943d7a0190414f107378ba46847" Jan 11 17:50:05 crc kubenswrapper[4837]: I0111 17:50:05.590981 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:06 crc kubenswrapper[4837]: I0111 17:50:06.084957 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zfjdc-config-pk98z"] Jan 11 17:50:06 crc kubenswrapper[4837]: I0111 17:50:06.091142 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zfjdc-config-pk98z"] Jan 11 17:50:06 crc kubenswrapper[4837]: I0111 17:50:06.374088 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803e6524-2b98-4987-852a-5e70cc5f99c9" path="/var/lib/kubelet/pods/803e6524-2b98-4987-852a-5e70cc5f99c9/volumes" Jan 11 17:50:09 crc kubenswrapper[4837]: I0111 17:50:09.444592 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:50:09 crc kubenswrapper[4837]: I0111 17:50:09.444946 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.188997 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.262761 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rv55z"] Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.263035 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-rv55z" podUID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerName="dnsmasq-dns" containerID="cri-o://c1c6287f733a41f27571dc8fc09e2b56cec55dba13b85e711379023f35711626" gracePeriod=10 Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.546976 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.565938 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.923279 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-rvfdr"] Jan 11 17:50:12 crc kubenswrapper[4837]: E0111 17:50:12.923587 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803e6524-2b98-4987-852a-5e70cc5f99c9" containerName="ovn-config" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.923600 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="803e6524-2b98-4987-852a-5e70cc5f99c9" containerName="ovn-config" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.923786 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="803e6524-2b98-4987-852a-5e70cc5f99c9" containerName="ovn-config" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.924236 4837 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.933835 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rvfdr"] Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.955749 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jmz\" (UniqueName: \"kubernetes.io/projected/29bd365d-3562-452a-a585-041d3f538ebe-kube-api-access-k6jmz\") pod \"cinder-db-create-rvfdr\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:12 crc kubenswrapper[4837]: I0111 17:50:12.955922 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29bd365d-3562-452a-a585-041d3f538ebe-operator-scripts\") pod \"cinder-db-create-rvfdr\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.012924 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8nczp"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.014115 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.037421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8nczp"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.049659 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-136e-account-create-update-chnvk"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.050957 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.055607 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.057327 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29bd365d-3562-452a-a585-041d3f538ebe-operator-scripts\") pod \"cinder-db-create-rvfdr\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.057403 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv6bv\" (UniqueName: \"kubernetes.io/projected/ff652e64-59f9-4670-8952-a33ee996c7e5-kube-api-access-fv6bv\") pod \"barbican-db-create-8nczp\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.057426 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jmz\" (UniqueName: \"kubernetes.io/projected/29bd365d-3562-452a-a585-041d3f538ebe-kube-api-access-k6jmz\") pod \"cinder-db-create-rvfdr\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.057457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff652e64-59f9-4670-8952-a33ee996c7e5-operator-scripts\") pod \"barbican-db-create-8nczp\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.058243 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/29bd365d-3562-452a-a585-041d3f538ebe-operator-scripts\") pod \"cinder-db-create-rvfdr\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.076488 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-136e-account-create-update-chnvk"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.078211 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jmz\" (UniqueName: \"kubernetes.io/projected/29bd365d-3562-452a-a585-041d3f538ebe-kube-api-access-k6jmz\") pod \"cinder-db-create-rvfdr\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.117464 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.133363 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9090-account-create-update-674ql"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.143173 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.145910 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.153053 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9090-account-create-update-674ql"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.161259 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv6bv\" (UniqueName: \"kubernetes.io/projected/ff652e64-59f9-4670-8952-a33ee996c7e5-kube-api-access-fv6bv\") pod \"barbican-db-create-8nczp\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.161330 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff652e64-59f9-4670-8952-a33ee996c7e5-operator-scripts\") pod \"barbican-db-create-8nczp\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.161363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwxn5\" (UniqueName: \"kubernetes.io/projected/f1d3da28-9c32-4294-b0b9-35b383dafb36-kube-api-access-nwxn5\") pod \"barbican-136e-account-create-update-chnvk\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.161380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3865f01a-21c1-4214-b979-14e72a764eb8-operator-scripts\") pod \"cinder-9090-account-create-update-674ql\" (UID: 
\"3865f01a-21c1-4214-b979-14e72a764eb8\") " pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.161412 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1d3da28-9c32-4294-b0b9-35b383dafb36-operator-scripts\") pod \"barbican-136e-account-create-update-chnvk\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.161464 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d89fn\" (UniqueName: \"kubernetes.io/projected/3865f01a-21c1-4214-b979-14e72a764eb8-kube-api-access-d89fn\") pod \"cinder-9090-account-create-update-674ql\" (UID: \"3865f01a-21c1-4214-b979-14e72a764eb8\") " pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.162859 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff652e64-59f9-4670-8952-a33ee996c7e5-operator-scripts\") pod \"barbican-db-create-8nczp\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.188313 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv6bv\" (UniqueName: \"kubernetes.io/projected/ff652e64-59f9-4670-8952-a33ee996c7e5-kube-api-access-fv6bv\") pod \"barbican-db-create-8nczp\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.214588 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-d8v5z"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.218642 4837 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.220712 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6ct2s" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.221641 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.221707 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.221820 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.225841 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d8v5z"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.263404 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf8rv\" (UniqueName: \"kubernetes.io/projected/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-kube-api-access-mf8rv\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.263461 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-combined-ca-bundle\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.263496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwxn5\" (UniqueName: \"kubernetes.io/projected/f1d3da28-9c32-4294-b0b9-35b383dafb36-kube-api-access-nwxn5\") pod 
\"barbican-136e-account-create-update-chnvk\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.263517 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3865f01a-21c1-4214-b979-14e72a764eb8-operator-scripts\") pod \"cinder-9090-account-create-update-674ql\" (UID: \"3865f01a-21c1-4214-b979-14e72a764eb8\") " pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.263549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1d3da28-9c32-4294-b0b9-35b383dafb36-operator-scripts\") pod \"barbican-136e-account-create-update-chnvk\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.263580 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-config-data\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.263607 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d89fn\" (UniqueName: \"kubernetes.io/projected/3865f01a-21c1-4214-b979-14e72a764eb8-kube-api-access-d89fn\") pod \"cinder-9090-account-create-update-674ql\" (UID: \"3865f01a-21c1-4214-b979-14e72a764eb8\") " pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.264692 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f1d3da28-9c32-4294-b0b9-35b383dafb36-operator-scripts\") pod \"barbican-136e-account-create-update-chnvk\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.265115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3865f01a-21c1-4214-b979-14e72a764eb8-operator-scripts\") pod \"cinder-9090-account-create-update-674ql\" (UID: \"3865f01a-21c1-4214-b979-14e72a764eb8\") " pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.281862 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwxn5\" (UniqueName: \"kubernetes.io/projected/f1d3da28-9c32-4294-b0b9-35b383dafb36-kube-api-access-nwxn5\") pod \"barbican-136e-account-create-update-chnvk\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.297891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d89fn\" (UniqueName: \"kubernetes.io/projected/3865f01a-21c1-4214-b979-14e72a764eb8-kube-api-access-d89fn\") pod \"cinder-9090-account-create-update-674ql\" (UID: \"3865f01a-21c1-4214-b979-14e72a764eb8\") " pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.343060 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5lj8p"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.344174 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.346984 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5lj8p"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.365084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-combined-ca-bundle\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.365150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-config-data\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.365197 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-operator-scripts\") pod \"neutron-db-create-5lj8p\" (UID: \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.365236 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl7bm\" (UniqueName: \"kubernetes.io/projected/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-kube-api-access-sl7bm\") pod \"neutron-db-create-5lj8p\" (UID: \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.365271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf8rv\" (UniqueName: 
\"kubernetes.io/projected/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-kube-api-access-mf8rv\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.370345 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-combined-ca-bundle\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.386158 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf8rv\" (UniqueName: \"kubernetes.io/projected/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-kube-api-access-mf8rv\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.387521 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-config-data\") pod \"keystone-db-sync-d8v5z\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.425434 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f46b-account-create-update-ctrv8"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.431427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.433418 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.435213 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.443839 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f46b-account-create-update-ctrv8"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.467489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02a3348-4ccd-4793-8f46-e735fa0fc49d-operator-scripts\") pod \"neutron-f46b-account-create-update-ctrv8\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.467553 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl7bm\" (UniqueName: \"kubernetes.io/projected/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-kube-api-access-sl7bm\") pod \"neutron-db-create-5lj8p\" (UID: \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.467661 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqzfl\" (UniqueName: \"kubernetes.io/projected/f02a3348-4ccd-4793-8f46-e735fa0fc49d-kube-api-access-vqzfl\") pod \"neutron-f46b-account-create-update-ctrv8\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.467784 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-operator-scripts\") pod \"neutron-db-create-5lj8p\" (UID: 
\"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.468391 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-operator-scripts\") pod \"neutron-db-create-5lj8p\" (UID: \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.482528 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.497620 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl7bm\" (UniqueName: \"kubernetes.io/projected/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-kube-api-access-sl7bm\") pod \"neutron-db-create-5lj8p\" (UID: \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.507928 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.533498 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.569104 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02a3348-4ccd-4793-8f46-e735fa0fc49d-operator-scripts\") pod \"neutron-f46b-account-create-update-ctrv8\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.569218 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqzfl\" (UniqueName: \"kubernetes.io/projected/f02a3348-4ccd-4793-8f46-e735fa0fc49d-kube-api-access-vqzfl\") pod \"neutron-f46b-account-create-update-ctrv8\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.570323 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02a3348-4ccd-4793-8f46-e735fa0fc49d-operator-scripts\") pod \"neutron-f46b-account-create-update-ctrv8\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.590977 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqzfl\" (UniqueName: \"kubernetes.io/projected/f02a3348-4ccd-4793-8f46-e735fa0fc49d-kube-api-access-vqzfl\") pod \"neutron-f46b-account-create-update-ctrv8\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.660568 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.699637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rv55z" event={"ID":"281a517d-5b8e-413b-8e6c-6555318d70c8","Type":"ContainerDied","Data":"c1c6287f733a41f27571dc8fc09e2b56cec55dba13b85e711379023f35711626"} Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.699630 4837 generic.go:334] "Generic (PLEG): container finished" podID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerID="c1c6287f733a41f27571dc8fc09e2b56cec55dba13b85e711379023f35711626" exitCode=0 Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.719617 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rvfdr"] Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.764003 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:13 crc kubenswrapper[4837]: I0111 17:50:13.975093 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8nczp"] Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.025515 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.080660 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v866\" (UniqueName: \"kubernetes.io/projected/281a517d-5b8e-413b-8e6c-6555318d70c8-kube-api-access-9v866\") pod \"281a517d-5b8e-413b-8e6c-6555318d70c8\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.080782 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-dns-svc\") pod \"281a517d-5b8e-413b-8e6c-6555318d70c8\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.080819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-nb\") pod \"281a517d-5b8e-413b-8e6c-6555318d70c8\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.080921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-sb\") pod \"281a517d-5b8e-413b-8e6c-6555318d70c8\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.080954 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-config\") pod \"281a517d-5b8e-413b-8e6c-6555318d70c8\" (UID: \"281a517d-5b8e-413b-8e6c-6555318d70c8\") " Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.091307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/281a517d-5b8e-413b-8e6c-6555318d70c8-kube-api-access-9v866" (OuterVolumeSpecName: "kube-api-access-9v866") pod "281a517d-5b8e-413b-8e6c-6555318d70c8" (UID: "281a517d-5b8e-413b-8e6c-6555318d70c8"). InnerVolumeSpecName "kube-api-access-9v866". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.115425 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-d8v5z"] Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.152976 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "281a517d-5b8e-413b-8e6c-6555318d70c8" (UID: "281a517d-5b8e-413b-8e6c-6555318d70c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.157116 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-136e-account-create-update-chnvk"] Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.160267 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "281a517d-5b8e-413b-8e6c-6555318d70c8" (UID: "281a517d-5b8e-413b-8e6c-6555318d70c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.180264 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-config" (OuterVolumeSpecName: "config") pod "281a517d-5b8e-413b-8e6c-6555318d70c8" (UID: "281a517d-5b8e-413b-8e6c-6555318d70c8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.180462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "281a517d-5b8e-413b-8e6c-6555318d70c8" (UID: "281a517d-5b8e-413b-8e6c-6555318d70c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.182323 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v866\" (UniqueName: \"kubernetes.io/projected/281a517d-5b8e-413b-8e6c-6555318d70c8-kube-api-access-9v866\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.182344 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.182354 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.182363 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.182372 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281a517d-5b8e-413b-8e6c-6555318d70c8-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.188116 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9090-account-create-update-674ql"] 
Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.418718 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f46b-account-create-update-ctrv8"] Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.423553 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5lj8p"] Jan 11 17:50:14 crc kubenswrapper[4837]: W0111 17:50:14.442569 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c8fa4d5_171c_4961_b1e2_203c2d2a128f.slice/crio-0a41605017e2885fdcc570954f9213a959ac90359dda3f8bb07193725beb6185 WatchSource:0}: Error finding container 0a41605017e2885fdcc570954f9213a959ac90359dda3f8bb07193725beb6185: Status 404 returned error can't find the container with id 0a41605017e2885fdcc570954f9213a959ac90359dda3f8bb07193725beb6185 Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.711327 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rv55z" event={"ID":"281a517d-5b8e-413b-8e6c-6555318d70c8","Type":"ContainerDied","Data":"9145b53e54b9a0ffc304535ec138f16427619aaec454e617ed64f3cb1d7dafe1"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.711347 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rv55z" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.711377 4837 scope.go:117] "RemoveContainer" containerID="c1c6287f733a41f27571dc8fc09e2b56cec55dba13b85e711379023f35711626" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.713588 4837 generic.go:334] "Generic (PLEG): container finished" podID="29bd365d-3562-452a-a585-041d3f538ebe" containerID="b7fe0e99090474c599792ca7a532bd6276dd675cebeb22e673233cb41e557686" exitCode=0 Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.713631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rvfdr" event={"ID":"29bd365d-3562-452a-a585-041d3f538ebe","Type":"ContainerDied","Data":"b7fe0e99090474c599792ca7a532bd6276dd675cebeb22e673233cb41e557686"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.713649 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rvfdr" event={"ID":"29bd365d-3562-452a-a585-041d3f538ebe","Type":"ContainerStarted","Data":"83eaf9156e94fd8ef9c293503e1fb578d88791dbad1d712c02a0b92cdbd837c6"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.714588 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f46b-account-create-update-ctrv8" event={"ID":"f02a3348-4ccd-4793-8f46-e735fa0fc49d","Type":"ContainerStarted","Data":"f3540d1940f867f006421cc1624181d8e020d65834eac2b63c72891addb94cc3"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.715793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8v5z" event={"ID":"3ca257e2-1e1f-401c-9a8e-776746d6bfe2","Type":"ContainerStarted","Data":"fd5e0721b6433645d1d246f06f4f62ecdf2ff85194e95ccb2b12fafbf0fe803b"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.716931 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8nczp" 
event={"ID":"ff652e64-59f9-4670-8952-a33ee996c7e5","Type":"ContainerStarted","Data":"e4b31cfde4e25b9262febddecd8a97fddf82fcde426e94897e7395faf084ffc9"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.716954 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8nczp" event={"ID":"ff652e64-59f9-4670-8952-a33ee996c7e5","Type":"ContainerStarted","Data":"41885ef8a9d43b60b59f5aaf6575cd7da4e6a54002051f21167e2bf2aee890d0"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.718479 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-136e-account-create-update-chnvk" event={"ID":"f1d3da28-9c32-4294-b0b9-35b383dafb36","Type":"ContainerStarted","Data":"ee5ecc1a9cfec995b82e04d12deb2fb465641ad952c38d225e5ec7da48803bb5"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.718519 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-136e-account-create-update-chnvk" event={"ID":"f1d3da28-9c32-4294-b0b9-35b383dafb36","Type":"ContainerStarted","Data":"c9da1846090f3c850a173a4d0b36710de3972bf1de93c3556832d50430e2f56f"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.722182 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5lj8p" event={"ID":"7c8fa4d5-171c-4961-b1e2-203c2d2a128f","Type":"ContainerStarted","Data":"0a41605017e2885fdcc570954f9213a959ac90359dda3f8bb07193725beb6185"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.723500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r8vzn" event={"ID":"ed1cfeaf-6da5-40ed-b605-077e5c95900c","Type":"ContainerStarted","Data":"a36a59c0876ae98d222aab78869c8bcdf4810ea828fc526059b03a3d70f1d37f"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.724544 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9090-account-create-update-674ql" 
event={"ID":"3865f01a-21c1-4214-b979-14e72a764eb8","Type":"ContainerStarted","Data":"1d2a6a0a70eca994b42dc0d4f40e9703d7324ffca4d61e9113e69c400cbb1446"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.724567 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9090-account-create-update-674ql" event={"ID":"3865f01a-21c1-4214-b979-14e72a764eb8","Type":"ContainerStarted","Data":"510b558f618f5e8dbbdf2de55ab1eeac5b07c40b4493b3b456b53fc554934d40"} Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.728307 4837 scope.go:117] "RemoveContainer" containerID="01765bd1879b2d50a7280384405ec922ebd0d4d2f673ecbb8a4a39fffb4e8ab5" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.763527 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rv55z"] Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.777156 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rv55z"] Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.778001 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-8nczp" podStartSLOduration=2.777982582 podStartE2EDuration="2.777982582s" podCreationTimestamp="2026-01-11 17:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:50:14.773026589 +0000 UTC m=+1188.951219305" watchObservedRunningTime="2026-01-11 17:50:14.777982582 +0000 UTC m=+1188.956175288" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.793111 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-9090-account-create-update-674ql" podStartSLOduration=1.7930938589999998 podStartE2EDuration="1.793093859s" podCreationTimestamp="2026-01-11 17:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-11 17:50:14.791921367 +0000 UTC m=+1188.970114073" watchObservedRunningTime="2026-01-11 17:50:14.793093859 +0000 UTC m=+1188.971286565" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.815781 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-r8vzn" podStartSLOduration=3.471635139 podStartE2EDuration="41.815760738s" podCreationTimestamp="2026-01-11 17:49:33 +0000 UTC" firstStartedPulling="2026-01-11 17:49:34.646352975 +0000 UTC m=+1148.824545681" lastFinishedPulling="2026-01-11 17:50:12.990478574 +0000 UTC m=+1187.168671280" observedRunningTime="2026-01-11 17:50:14.808085901 +0000 UTC m=+1188.986278647" watchObservedRunningTime="2026-01-11 17:50:14.815760738 +0000 UTC m=+1188.993953444" Jan 11 17:50:14 crc kubenswrapper[4837]: I0111 17:50:14.824435 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-136e-account-create-update-chnvk" podStartSLOduration=1.8244175500000002 podStartE2EDuration="1.82441755s" podCreationTimestamp="2026-01-11 17:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:50:14.821490822 +0000 UTC m=+1188.999683528" watchObservedRunningTime="2026-01-11 17:50:14.82441755 +0000 UTC m=+1189.002610256" Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.758465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9090-account-create-update-674ql" event={"ID":"3865f01a-21c1-4214-b979-14e72a764eb8","Type":"ContainerDied","Data":"1d2a6a0a70eca994b42dc0d4f40e9703d7324ffca4d61e9113e69c400cbb1446"} Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.758477 4837 generic.go:334] "Generic (PLEG): container finished" podID="3865f01a-21c1-4214-b979-14e72a764eb8" containerID="1d2a6a0a70eca994b42dc0d4f40e9703d7324ffca4d61e9113e69c400cbb1446" exitCode=0 Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 
17:50:15.763878 4837 generic.go:334] "Generic (PLEG): container finished" podID="ff652e64-59f9-4670-8952-a33ee996c7e5" containerID="e4b31cfde4e25b9262febddecd8a97fddf82fcde426e94897e7395faf084ffc9" exitCode=0 Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.763927 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8nczp" event={"ID":"ff652e64-59f9-4670-8952-a33ee996c7e5","Type":"ContainerDied","Data":"e4b31cfde4e25b9262febddecd8a97fddf82fcde426e94897e7395faf084ffc9"} Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.765365 4837 generic.go:334] "Generic (PLEG): container finished" podID="f1d3da28-9c32-4294-b0b9-35b383dafb36" containerID="ee5ecc1a9cfec995b82e04d12deb2fb465641ad952c38d225e5ec7da48803bb5" exitCode=0 Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.765464 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-136e-account-create-update-chnvk" event={"ID":"f1d3da28-9c32-4294-b0b9-35b383dafb36","Type":"ContainerDied","Data":"ee5ecc1a9cfec995b82e04d12deb2fb465641ad952c38d225e5ec7da48803bb5"} Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.768978 4837 generic.go:334] "Generic (PLEG): container finished" podID="7c8fa4d5-171c-4961-b1e2-203c2d2a128f" containerID="820a117f8c24d68f1bd3a716a9093ff50bfc7224e3c22ac46e6eac2b9cf22ef9" exitCode=0 Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.769027 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5lj8p" event={"ID":"7c8fa4d5-171c-4961-b1e2-203c2d2a128f","Type":"ContainerDied","Data":"820a117f8c24d68f1bd3a716a9093ff50bfc7224e3c22ac46e6eac2b9cf22ef9"} Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.771921 4837 generic.go:334] "Generic (PLEG): container finished" podID="f02a3348-4ccd-4793-8f46-e735fa0fc49d" containerID="bc7fb78df1712c24841197035ddd115fb2ce6e77521f6ba61641113c90a27917" exitCode=0 Jan 11 17:50:15 crc kubenswrapper[4837]: I0111 17:50:15.772065 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f46b-account-create-update-ctrv8" event={"ID":"f02a3348-4ccd-4793-8f46-e735fa0fc49d","Type":"ContainerDied","Data":"bc7fb78df1712c24841197035ddd115fb2ce6e77521f6ba61641113c90a27917"} Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.075871 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.119513 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29bd365d-3562-452a-a585-041d3f538ebe-operator-scripts\") pod \"29bd365d-3562-452a-a585-041d3f538ebe\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.119567 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6jmz\" (UniqueName: \"kubernetes.io/projected/29bd365d-3562-452a-a585-041d3f538ebe-kube-api-access-k6jmz\") pod \"29bd365d-3562-452a-a585-041d3f538ebe\" (UID: \"29bd365d-3562-452a-a585-041d3f538ebe\") " Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.120008 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29bd365d-3562-452a-a585-041d3f538ebe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29bd365d-3562-452a-a585-041d3f538ebe" (UID: "29bd365d-3562-452a-a585-041d3f538ebe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.135244 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29bd365d-3562-452a-a585-041d3f538ebe-kube-api-access-k6jmz" (OuterVolumeSpecName: "kube-api-access-k6jmz") pod "29bd365d-3562-452a-a585-041d3f538ebe" (UID: "29bd365d-3562-452a-a585-041d3f538ebe"). 
InnerVolumeSpecName "kube-api-access-k6jmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.221638 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29bd365d-3562-452a-a585-041d3f538ebe-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.221701 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6jmz\" (UniqueName: \"kubernetes.io/projected/29bd365d-3562-452a-a585-041d3f538ebe-kube-api-access-k6jmz\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.386070 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281a517d-5b8e-413b-8e6c-6555318d70c8" path="/var/lib/kubelet/pods/281a517d-5b8e-413b-8e6c-6555318d70c8/volumes" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.786837 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rvfdr" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.786871 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rvfdr" event={"ID":"29bd365d-3562-452a-a585-041d3f538ebe","Type":"ContainerDied","Data":"83eaf9156e94fd8ef9c293503e1fb578d88791dbad1d712c02a0b92cdbd837c6"} Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:16.787473 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83eaf9156e94fd8ef9c293503e1fb578d88791dbad1d712c02a0b92cdbd837c6" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.835977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f46b-account-create-update-ctrv8" event={"ID":"f02a3348-4ccd-4793-8f46-e735fa0fc49d","Type":"ContainerDied","Data":"f3540d1940f867f006421cc1624181d8e020d65834eac2b63c72891addb94cc3"} Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.837232 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3540d1940f867f006421cc1624181d8e020d65834eac2b63c72891addb94cc3" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.838939 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9090-account-create-update-674ql" event={"ID":"3865f01a-21c1-4214-b979-14e72a764eb8","Type":"ContainerDied","Data":"510b558f618f5e8dbbdf2de55ab1eeac5b07c40b4493b3b456b53fc554934d40"} Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.839023 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="510b558f618f5e8dbbdf2de55ab1eeac5b07c40b4493b3b456b53fc554934d40" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.842304 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8v5z" event={"ID":"3ca257e2-1e1f-401c-9a8e-776746d6bfe2","Type":"ContainerStarted","Data":"93c7e0d022ac13d05d186210059cd3587449e76f24ba046842ec3e34a7e70f73"} Jan 11 17:50:20 crc 
kubenswrapper[4837]: I0111 17:50:20.847352 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8nczp" event={"ID":"ff652e64-59f9-4670-8952-a33ee996c7e5","Type":"ContainerDied","Data":"41885ef8a9d43b60b59f5aaf6575cd7da4e6a54002051f21167e2bf2aee890d0"} Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.847461 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41885ef8a9d43b60b59f5aaf6575cd7da4e6a54002051f21167e2bf2aee890d0" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.851085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-136e-account-create-update-chnvk" event={"ID":"f1d3da28-9c32-4294-b0b9-35b383dafb36","Type":"ContainerDied","Data":"c9da1846090f3c850a173a4d0b36710de3972bf1de93c3556832d50430e2f56f"} Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.851119 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9da1846090f3c850a173a4d0b36710de3972bf1de93c3556832d50430e2f56f" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.881460 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-d8v5z" podStartSLOduration=1.535899079 podStartE2EDuration="7.881437673s" podCreationTimestamp="2026-01-11 17:50:13 +0000 UTC" firstStartedPulling="2026-01-11 17:50:14.150864126 +0000 UTC m=+1188.329056832" lastFinishedPulling="2026-01-11 17:50:20.49640269 +0000 UTC m=+1194.674595426" observedRunningTime="2026-01-11 17:50:20.862211346 +0000 UTC m=+1195.040404052" watchObservedRunningTime="2026-01-11 17:50:20.881437673 +0000 UTC m=+1195.059630379" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.884144 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.891149 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.913936 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.919939 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:20 crc kubenswrapper[4837]: I0111 17:50:20.926419 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.010149 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3865f01a-21c1-4214-b979-14e72a764eb8-operator-scripts\") pod \"3865f01a-21c1-4214-b979-14e72a764eb8\" (UID: \"3865f01a-21c1-4214-b979-14e72a764eb8\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.010478 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff652e64-59f9-4670-8952-a33ee996c7e5-operator-scripts\") pod \"ff652e64-59f9-4670-8952-a33ee996c7e5\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.011128 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d89fn\" (UniqueName: \"kubernetes.io/projected/3865f01a-21c1-4214-b979-14e72a764eb8-kube-api-access-d89fn\") pod \"3865f01a-21c1-4214-b979-14e72a764eb8\" (UID: \"3865f01a-21c1-4214-b979-14e72a764eb8\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.011886 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqzfl\" (UniqueName: 
\"kubernetes.io/projected/f02a3348-4ccd-4793-8f46-e735fa0fc49d-kube-api-access-vqzfl\") pod \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.010945 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3865f01a-21c1-4214-b979-14e72a764eb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3865f01a-21c1-4214-b979-14e72a764eb8" (UID: "3865f01a-21c1-4214-b979-14e72a764eb8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.011061 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff652e64-59f9-4670-8952-a33ee996c7e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff652e64-59f9-4670-8952-a33ee996c7e5" (UID: "ff652e64-59f9-4670-8952-a33ee996c7e5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.012075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwxn5\" (UniqueName: \"kubernetes.io/projected/f1d3da28-9c32-4294-b0b9-35b383dafb36-kube-api-access-nwxn5\") pod \"f1d3da28-9c32-4294-b0b9-35b383dafb36\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.012249 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1d3da28-9c32-4294-b0b9-35b383dafb36-operator-scripts\") pod \"f1d3da28-9c32-4294-b0b9-35b383dafb36\" (UID: \"f1d3da28-9c32-4294-b0b9-35b383dafb36\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.012338 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv6bv\" (UniqueName: \"kubernetes.io/projected/ff652e64-59f9-4670-8952-a33ee996c7e5-kube-api-access-fv6bv\") pod \"ff652e64-59f9-4670-8952-a33ee996c7e5\" (UID: \"ff652e64-59f9-4670-8952-a33ee996c7e5\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.012384 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02a3348-4ccd-4793-8f46-e735fa0fc49d-operator-scripts\") pod \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\" (UID: \"f02a3348-4ccd-4793-8f46-e735fa0fc49d\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.013069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f02a3348-4ccd-4793-8f46-e735fa0fc49d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f02a3348-4ccd-4793-8f46-e735fa0fc49d" (UID: "f02a3348-4ccd-4793-8f46-e735fa0fc49d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.013151 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3865f01a-21c1-4214-b979-14e72a764eb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.013173 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff652e64-59f9-4670-8952-a33ee996c7e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.013720 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d3da28-9c32-4294-b0b9-35b383dafb36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1d3da28-9c32-4294-b0b9-35b383dafb36" (UID: "f1d3da28-9c32-4294-b0b9-35b383dafb36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.014926 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3865f01a-21c1-4214-b979-14e72a764eb8-kube-api-access-d89fn" (OuterVolumeSpecName: "kube-api-access-d89fn") pod "3865f01a-21c1-4214-b979-14e72a764eb8" (UID: "3865f01a-21c1-4214-b979-14e72a764eb8"). InnerVolumeSpecName "kube-api-access-d89fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.015593 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f02a3348-4ccd-4793-8f46-e735fa0fc49d-kube-api-access-vqzfl" (OuterVolumeSpecName: "kube-api-access-vqzfl") pod "f02a3348-4ccd-4793-8f46-e735fa0fc49d" (UID: "f02a3348-4ccd-4793-8f46-e735fa0fc49d"). InnerVolumeSpecName "kube-api-access-vqzfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.015798 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d3da28-9c32-4294-b0b9-35b383dafb36-kube-api-access-nwxn5" (OuterVolumeSpecName: "kube-api-access-nwxn5") pod "f1d3da28-9c32-4294-b0b9-35b383dafb36" (UID: "f1d3da28-9c32-4294-b0b9-35b383dafb36"). InnerVolumeSpecName "kube-api-access-nwxn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.016893 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff652e64-59f9-4670-8952-a33ee996c7e5-kube-api-access-fv6bv" (OuterVolumeSpecName: "kube-api-access-fv6bv") pod "ff652e64-59f9-4670-8952-a33ee996c7e5" (UID: "ff652e64-59f9-4670-8952-a33ee996c7e5"). InnerVolumeSpecName "kube-api-access-fv6bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.113937 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-operator-scripts\") pod \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\" (UID: \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114132 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl7bm\" (UniqueName: \"kubernetes.io/projected/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-kube-api-access-sl7bm\") pod \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\" (UID: \"7c8fa4d5-171c-4961-b1e2-203c2d2a128f\") " Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114474 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"7c8fa4d5-171c-4961-b1e2-203c2d2a128f" (UID: "7c8fa4d5-171c-4961-b1e2-203c2d2a128f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114632 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d89fn\" (UniqueName: \"kubernetes.io/projected/3865f01a-21c1-4214-b979-14e72a764eb8-kube-api-access-d89fn\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114706 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqzfl\" (UniqueName: \"kubernetes.io/projected/f02a3348-4ccd-4793-8f46-e735fa0fc49d-kube-api-access-vqzfl\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114766 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114814 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwxn5\" (UniqueName: \"kubernetes.io/projected/f1d3da28-9c32-4294-b0b9-35b383dafb36-kube-api-access-nwxn5\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114860 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1d3da28-9c32-4294-b0b9-35b383dafb36-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114910 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv6bv\" (UniqueName: \"kubernetes.io/projected/ff652e64-59f9-4670-8952-a33ee996c7e5-kube-api-access-fv6bv\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.114957 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/f02a3348-4ccd-4793-8f46-e735fa0fc49d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.117626 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-kube-api-access-sl7bm" (OuterVolumeSpecName: "kube-api-access-sl7bm") pod "7c8fa4d5-171c-4961-b1e2-203c2d2a128f" (UID: "7c8fa4d5-171c-4961-b1e2-203c2d2a128f"). InnerVolumeSpecName "kube-api-access-sl7bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.216640 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl7bm\" (UniqueName: \"kubernetes.io/projected/7c8fa4d5-171c-4961-b1e2-203c2d2a128f-kube-api-access-sl7bm\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.874506 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-136e-account-create-update-chnvk" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.874537 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5lj8p" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.874537 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9090-account-create-update-674ql" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.874569 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5lj8p" event={"ID":"7c8fa4d5-171c-4961-b1e2-203c2d2a128f","Type":"ContainerDied","Data":"0a41605017e2885fdcc570954f9213a959ac90359dda3f8bb07193725beb6185"} Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.874625 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a41605017e2885fdcc570954f9213a959ac90359dda3f8bb07193725beb6185" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.874558 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f46b-account-create-update-ctrv8" Jan 11 17:50:21 crc kubenswrapper[4837]: I0111 17:50:21.874713 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8nczp" Jan 11 17:50:23 crc kubenswrapper[4837]: I0111 17:50:23.896755 4837 generic.go:334] "Generic (PLEG): container finished" podID="3ca257e2-1e1f-401c-9a8e-776746d6bfe2" containerID="93c7e0d022ac13d05d186210059cd3587449e76f24ba046842ec3e34a7e70f73" exitCode=0 Jan 11 17:50:23 crc kubenswrapper[4837]: I0111 17:50:23.896832 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8v5z" event={"ID":"3ca257e2-1e1f-401c-9a8e-776746d6bfe2","Type":"ContainerDied","Data":"93c7e0d022ac13d05d186210059cd3587449e76f24ba046842ec3e34a7e70f73"} Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.236765 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.302962 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-config-data\") pod \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.303134 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-combined-ca-bundle\") pod \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.303212 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf8rv\" (UniqueName: \"kubernetes.io/projected/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-kube-api-access-mf8rv\") pod \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\" (UID: \"3ca257e2-1e1f-401c-9a8e-776746d6bfe2\") " Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.308444 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-kube-api-access-mf8rv" (OuterVolumeSpecName: "kube-api-access-mf8rv") pod "3ca257e2-1e1f-401c-9a8e-776746d6bfe2" (UID: "3ca257e2-1e1f-401c-9a8e-776746d6bfe2"). InnerVolumeSpecName "kube-api-access-mf8rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.332375 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ca257e2-1e1f-401c-9a8e-776746d6bfe2" (UID: "3ca257e2-1e1f-401c-9a8e-776746d6bfe2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.339136 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-config-data" (OuterVolumeSpecName: "config-data") pod "3ca257e2-1e1f-401c-9a8e-776746d6bfe2" (UID: "3ca257e2-1e1f-401c-9a8e-776746d6bfe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.405895 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.405924 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.405935 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf8rv\" (UniqueName: \"kubernetes.io/projected/3ca257e2-1e1f-401c-9a8e-776746d6bfe2-kube-api-access-mf8rv\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.930708 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-d8v5z" event={"ID":"3ca257e2-1e1f-401c-9a8e-776746d6bfe2","Type":"ContainerDied","Data":"fd5e0721b6433645d1d246f06f4f62ecdf2ff85194e95ccb2b12fafbf0fe803b"} Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.930795 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd5e0721b6433645d1d246f06f4f62ecdf2ff85194e95ccb2b12fafbf0fe803b" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.930991 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-d8v5z" Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.940429 4837 generic.go:334] "Generic (PLEG): container finished" podID="ed1cfeaf-6da5-40ed-b605-077e5c95900c" containerID="a36a59c0876ae98d222aab78869c8bcdf4810ea828fc526059b03a3d70f1d37f" exitCode=0 Jan 11 17:50:25 crc kubenswrapper[4837]: I0111 17:50:25.940513 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r8vzn" event={"ID":"ed1cfeaf-6da5-40ed-b605-077e5c95900c","Type":"ContainerDied","Data":"a36a59c0876ae98d222aab78869c8bcdf4810ea828fc526059b03a3d70f1d37f"} Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.220751 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-zmx2m"] Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221489 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff652e64-59f9-4670-8952-a33ee996c7e5" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221507 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff652e64-59f9-4670-8952-a33ee996c7e5" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221529 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerName="init" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221539 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerName="init" Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221552 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d3da28-9c32-4294-b0b9-35b383dafb36" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221561 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d3da28-9c32-4294-b0b9-35b383dafb36" containerName="mariadb-account-create-update" 
Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221574 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8fa4d5-171c-4961-b1e2-203c2d2a128f" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221581 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8fa4d5-171c-4961-b1e2-203c2d2a128f" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221596 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3865f01a-21c1-4214-b979-14e72a764eb8" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221604 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3865f01a-21c1-4214-b979-14e72a764eb8" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221616 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02a3348-4ccd-4793-8f46-e735fa0fc49d" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221624 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02a3348-4ccd-4793-8f46-e735fa0fc49d" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221636 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca257e2-1e1f-401c-9a8e-776746d6bfe2" containerName="keystone-db-sync" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221644 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca257e2-1e1f-401c-9a8e-776746d6bfe2" containerName="keystone-db-sync" Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221660 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bd365d-3562-452a-a585-041d3f538ebe" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221668 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29bd365d-3562-452a-a585-041d3f538ebe" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: E0111 17:50:26.221700 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerName="dnsmasq-dns" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221708 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerName="dnsmasq-dns" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221875 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff652e64-59f9-4670-8952-a33ee996c7e5" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221890 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d3da28-9c32-4294-b0b9-35b383dafb36" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221901 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f02a3348-4ccd-4793-8f46-e735fa0fc49d" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221913 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3865f01a-21c1-4214-b979-14e72a764eb8" containerName="mariadb-account-create-update" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221927 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bd365d-3562-452a-a585-041d3f538ebe" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221939 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="281a517d-5b8e-413b-8e6c-6555318d70c8" containerName="dnsmasq-dns" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.221951 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca257e2-1e1f-401c-9a8e-776746d6bfe2" containerName="keystone-db-sync" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 
17:50:26.221963 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8fa4d5-171c-4961-b1e2-203c2d2a128f" containerName="mariadb-database-create" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.222822 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dtdbs"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.223505 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.223521 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.228661 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.228883 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6ct2s" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.229024 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.229146 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.229340 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.253911 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dtdbs"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.262790 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-zmx2m"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.323643 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.323708 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-config\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.323767 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.323850 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-svc\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.323890 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.323922 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbwx\" (UniqueName: \"kubernetes.io/projected/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-kube-api-access-hdbwx\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.380527 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-774fb4c689-s8tf9"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.381780 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.384890 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-2p4v5" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.385105 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.385230 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.388932 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.392309 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-774fb4c689-s8tf9"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.401811 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-l58f7"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.403111 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.408301 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6cjjv" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.408328 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.408917 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.413934 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-l58f7"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425308 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-svc\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-combined-ca-bundle\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425375 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-credential-keys\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425390 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s8qr\" (UniqueName: \"kubernetes.io/projected/42cb0cea-5da0-4694-b28f-a24f296b6399-kube-api-access-9s8qr\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-config-data\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425464 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbwx\" (UniqueName: \"kubernetes.io/projected/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-kube-api-access-hdbwx\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-scripts\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425543 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425565 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-config\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425608 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-fernet-keys\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.425634 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.426551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.427754 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.428332 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.429033 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-svc\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.429561 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-config\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.489391 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbwx\" (UniqueName: \"kubernetes.io/projected/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-kube-api-access-hdbwx\") pod \"dnsmasq-dns-55fff446b9-zmx2m\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532120 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fv6l8"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532602 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-fernet-keys\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532647 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-scripts\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532666 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-horizon-secret-key\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-config-data\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532719 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-db-sync-config-data\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7st6m\" (UniqueName: \"kubernetes.io/projected/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-kube-api-access-7st6m\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532757 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djtfv\" (UniqueName: \"kubernetes.io/projected/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-kube-api-access-djtfv\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-combined-ca-bundle\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532787 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-logs\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532805 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-credential-keys\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532820 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s8qr\" 
(UniqueName: \"kubernetes.io/projected/42cb0cea-5da0-4694-b28f-a24f296b6399-kube-api-access-9s8qr\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532837 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-etc-machine-id\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532867 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-config-data\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532898 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-scripts\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532919 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-scripts\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532966 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-config-data\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.532983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-combined-ca-bundle\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.541501 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.546356 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-fernet-keys\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.547938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-scripts\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.548422 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-combined-ca-bundle\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.555129 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"neutron-httpd-config" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.558433 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-credential-keys\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.559069 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mp677" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.560332 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-config-data\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.565517 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.569568 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.575059 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7755778889-fjtkp"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.576346 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.617596 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fv6l8"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.640316 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s8qr\" (UniqueName: \"kubernetes.io/projected/42cb0cea-5da0-4694-b28f-a24f296b6399-kube-api-access-9s8qr\") pod \"keystone-bootstrap-dtdbs\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641128 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-scripts\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641155 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-horizon-secret-key\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641171 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-config-data\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641189 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-db-sync-config-data\") pod 
\"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7st6m\" (UniqueName: \"kubernetes.io/projected/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-kube-api-access-7st6m\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641290 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djtfv\" (UniqueName: \"kubernetes.io/projected/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-kube-api-access-djtfv\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-logs\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-etc-machine-id\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641371 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-scripts\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 
11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641420 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-config-data\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-combined-ca-bundle\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.641721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-etc-machine-id\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.642216 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-logs\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.644796 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7755778889-fjtkp"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.654260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-scripts\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc 
kubenswrapper[4837]: I0111 17:50:26.659033 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-horizon-secret-key\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.660192 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-scripts\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.666104 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-db-sync-config-data\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.671525 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-config-data\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.673159 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-combined-ca-bundle\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.681894 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-config-data\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.698447 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djtfv\" (UniqueName: \"kubernetes.io/projected/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-kube-api-access-djtfv\") pod \"horizon-774fb4c689-s8tf9\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.699167 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6mdwl"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.717768 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.720457 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.721834 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.722319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7st6m\" (UniqueName: \"kubernetes.io/projected/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-kube-api-access-7st6m\") pod \"cinder-db-sync-l58f7\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.736348 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fvnjj" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.748359 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-l58f7" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.780987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77a72881-5957-4b75-b490-0417214a39b0-logs\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.781035 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-config-data\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.781074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlm7h\" (UniqueName: \"kubernetes.io/projected/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-kube-api-access-tlm7h\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.781157 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77a72881-5957-4b75-b490-0417214a39b0-horizon-secret-key\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.781255 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-config\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " 
pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.781308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-combined-ca-bundle\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.781337 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdppn\" (UniqueName: \"kubernetes.io/projected/77a72881-5957-4b75-b490-0417214a39b0-kube-api-access-bdppn\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.781423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-scripts\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.790009 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6mdwl"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.800738 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.873164 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.875886 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.878340 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.878941 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.886807 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77a72881-5957-4b75-b490-0417214a39b0-logs\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887765 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-config-data\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlm7h\" (UniqueName: \"kubernetes.io/projected/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-kube-api-access-tlm7h\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887847 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77a72881-5957-4b75-b490-0417214a39b0-horizon-secret-key\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887888 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-db-sync-config-data\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887917 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbsr2\" (UniqueName: \"kubernetes.io/projected/5930b460-1c65-4c06-a3bc-f6d6f0518110-kube-api-access-rbsr2\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-config\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.887985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-combined-ca-bundle\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.888011 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdppn\" (UniqueName: \"kubernetes.io/projected/77a72881-5957-4b75-b490-0417214a39b0-kube-api-access-bdppn\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.888033 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-combined-ca-bundle\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.888078 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-scripts\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.889049 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77a72881-5957-4b75-b490-0417214a39b0-logs\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.889920 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-scripts\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.894309 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-config-data\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.901607 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-zmx2m"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.903233 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77a72881-5957-4b75-b490-0417214a39b0-horizon-secret-key\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.903311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-combined-ca-bundle\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.903967 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-config\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.917443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdppn\" (UniqueName: \"kubernetes.io/projected/77a72881-5957-4b75-b490-0417214a39b0-kube-api-access-bdppn\") pod \"horizon-7755778889-fjtkp\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.920067 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-k7p7p"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.921201 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.937328 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlm7h\" (UniqueName: \"kubernetes.io/projected/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-kube-api-access-tlm7h\") pod \"neutron-db-sync-fv6l8\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.938903 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-glkr2"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.941412 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.943126 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.943318 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ltsqk" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.943422 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.962263 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-glkr2"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.980908 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-k7p7p"] Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.991648 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-config-data\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 
17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.991700 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:26 crc kubenswrapper[4837]: I0111 17:50:26.991722 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-db-sync-config-data\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:26.991744 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsr2\" (UniqueName: \"kubernetes.io/projected/5930b460-1c65-4c06-a3bc-f6d6f0518110-kube-api-access-rbsr2\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:26.999209 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-scripts\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:26.999242 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-log-httpd\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:26.999266 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k964n\" (UniqueName: \"kubernetes.io/projected/5ec0beaf-de63-407f-8d18-46738023ab11-kube-api-access-k964n\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:26.999373 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-combined-ca-bundle\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:26.999389 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:26.999563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-run-httpd\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.004926 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-combined-ca-bundle\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.011695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-db-sync-config-data\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.015898 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsr2\" (UniqueName: \"kubernetes.io/projected/5930b460-1c65-4c06-a3bc-f6d6f0518110-kube-api-access-rbsr2\") pod \"barbican-db-sync-6mdwl\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101521 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbk4d\" (UniqueName: \"kubernetes.io/projected/e7385aa0-566d-4f8d-b561-3295e49dd1fd-kube-api-access-zbk4d\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101664 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101786 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-config-data\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-run-httpd\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-combined-ca-bundle\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101961 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.101992 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-config\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.102062 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-config-data\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.102141 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.102215 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5803c46-a48f-4120-9010-51375caff2a5-logs\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.102234 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7czq\" (UniqueName: \"kubernetes.io/projected/c5803c46-a48f-4120-9010-51375caff2a5-kube-api-access-l7czq\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.102289 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-nb\") pod 
\"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.103223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-scripts\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.103264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-log-httpd\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.103279 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k964n\" (UniqueName: \"kubernetes.io/projected/5ec0beaf-de63-407f-8d18-46738023ab11-kube-api-access-k964n\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.103301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-scripts\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.103915 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-run-httpd\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.104151 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-log-httpd\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.106568 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.107695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.108145 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-config-data\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.109643 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-scripts\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.112294 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.124930 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k964n\" 
(UniqueName: \"kubernetes.io/projected/5ec0beaf-de63-407f-8d18-46738023ab11-kube-api-access-k964n\") pod \"ceilometer-0\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.174807 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.194235 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205041 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbk4d\" (UniqueName: \"kubernetes.io/projected/e7385aa0-566d-4f8d-b561-3295e49dd1fd-kube-api-access-zbk4d\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205132 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-config-data\") pod 
\"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205194 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-combined-ca-bundle\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205220 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205252 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-config\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205284 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5803c46-a48f-4120-9010-51375caff2a5-logs\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205302 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7czq\" (UniqueName: \"kubernetes.io/projected/c5803c46-a48f-4120-9010-51375caff2a5-kube-api-access-l7czq\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " 
pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205319 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205344 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-scripts\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.205927 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5803c46-a48f-4120-9010-51375caff2a5-logs\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.206547 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.207083 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-config\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.207335 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.207819 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.213431 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-scripts\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.213903 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.213938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-config-data\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.215718 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-combined-ca-bundle\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.217156 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.221647 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbk4d\" (UniqueName: \"kubernetes.io/projected/e7385aa0-566d-4f8d-b561-3295e49dd1fd-kube-api-access-zbk4d\") pod \"dnsmasq-dns-76fcf4b695-glkr2\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.233761 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7czq\" (UniqueName: \"kubernetes.io/projected/c5803c46-a48f-4120-9010-51375caff2a5-kube-api-access-l7czq\") pod \"placement-db-sync-k7p7p\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.277182 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k7p7p" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.294166 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.415271 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-l58f7"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.429142 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-zmx2m"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.555558 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dtdbs"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.677389 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-774fb4c689-s8tf9"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.683198 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7755778889-fjtkp"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.712341 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fv6l8"] Jan 11 17:50:27 crc kubenswrapper[4837]: W0111 17:50:27.832359 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b22cc91_a2b0_4468_96bc_73cb4aab66bb.slice/crio-42ccf9115a3447e853ebf96b73d2dc31114b47cf09877edca8126916216d2b39 WatchSource:0}: Error finding container 42ccf9115a3447e853ebf96b73d2dc31114b47cf09877edca8126916216d2b39: Status 404 returned error can't find the container with id 42ccf9115a3447e853ebf96b73d2dc31114b47cf09877edca8126916216d2b39 Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.834715 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-r8vzn" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.900601 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6mdwl"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.914904 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-combined-ca-bundle\") pod \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.914944 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6sfd\" (UniqueName: \"kubernetes.io/projected/ed1cfeaf-6da5-40ed-b605-077e5c95900c-kube-api-access-g6sfd\") pod \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.915304 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-db-sync-config-data\") pod \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.915344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-config-data\") pod \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\" (UID: \"ed1cfeaf-6da5-40ed-b605-077e5c95900c\") " Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.919057 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.929988 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ed1cfeaf-6da5-40ed-b605-077e5c95900c" (UID: "ed1cfeaf-6da5-40ed-b605-077e5c95900c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.931238 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1cfeaf-6da5-40ed-b605-077e5c95900c-kube-api-access-g6sfd" (OuterVolumeSpecName: "kube-api-access-g6sfd") pod "ed1cfeaf-6da5-40ed-b605-077e5c95900c" (UID: "ed1cfeaf-6da5-40ed-b605-077e5c95900c"). InnerVolumeSpecName "kube-api-access-g6sfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.967556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755778889-fjtkp" event={"ID":"77a72881-5957-4b75-b490-0417214a39b0","Type":"ContainerStarted","Data":"017a024d95cc6246ecd3c548904f78c3c70916b5beb921eaead52672bb8e7dcf"} Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.971344 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-774fb4c689-s8tf9" event={"ID":"de0a2bd6-2144-4c5c-91ab-a5fde49c9808","Type":"ContainerStarted","Data":"73351a7db683ab2a22f755d6abd69634d726c38cf0296db896a17b6b15b98f0f"} Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.981250 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-config-data" (OuterVolumeSpecName: "config-data") pod "ed1cfeaf-6da5-40ed-b605-077e5c95900c" (UID: "ed1cfeaf-6da5-40ed-b605-077e5c95900c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.983587 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed1cfeaf-6da5-40ed-b605-077e5c95900c" (UID: "ed1cfeaf-6da5-40ed-b605-077e5c95900c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.989708 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-k7p7p"] Jan 11 17:50:27 crc kubenswrapper[4837]: I0111 17:50:27.995173 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-glkr2"] Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.020111 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" event={"ID":"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510","Type":"ContainerStarted","Data":"00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2"} Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.020915 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" event={"ID":"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510","Type":"ContainerStarted","Data":"617043aa32759256129f835163cefeb1e6cd7b0023d08359d2e84a9555f0578c"} Jan 11 17:50:28 crc kubenswrapper[4837]: W0111 17:50:28.034275 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5803c46_a48f_4120_9010_51375caff2a5.slice/crio-754de8f0e133e15347340ecdd59668a8763878335076729796b6ad0e10ca77bf WatchSource:0}: Error finding container 754de8f0e133e15347340ecdd59668a8763878335076729796b6ad0e10ca77bf: Status 404 returned error can't find the container with id 754de8f0e133e15347340ecdd59668a8763878335076729796b6ad0e10ca77bf Jan 
11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.034361 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6sfd\" (UniqueName: \"kubernetes.io/projected/ed1cfeaf-6da5-40ed-b605-077e5c95900c-kube-api-access-g6sfd\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.034386 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.034413 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.034424 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1cfeaf-6da5-40ed-b605-077e5c95900c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.051123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mdwl" event={"ID":"5930b460-1c65-4c06-a3bc-f6d6f0518110","Type":"ContainerStarted","Data":"fac5f64fc3629ce6fec395f0cbcd55a5181cd6499ab0f35614aa3073533395eb"} Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.076834 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtdbs" event={"ID":"42cb0cea-5da0-4694-b28f-a24f296b6399","Type":"ContainerStarted","Data":"a19f25df1eb806552c2981f7e193e0b388af7bc05792353f25cf48021a2ad99f"} Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.079960 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv6l8" 
event={"ID":"1b22cc91-a2b0-4468-96bc-73cb4aab66bb","Type":"ContainerStarted","Data":"42ccf9115a3447e853ebf96b73d2dc31114b47cf09877edca8126916216d2b39"} Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.083837 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-l58f7" event={"ID":"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895","Type":"ContainerStarted","Data":"15e5cfa4043cd70b8c50268be91873413c661ee651c5e28d59fff158e2f43142"} Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.096167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r8vzn" event={"ID":"ed1cfeaf-6da5-40ed-b605-077e5c95900c","Type":"ContainerDied","Data":"a0b6ab519ec350b87b042e380a38c6f87de122c9827e623f60a8415eb6f1f8f4"} Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.096200 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0b6ab519ec350b87b042e380a38c6f87de122c9827e623f60a8415eb6f1f8f4" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.096278 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-r8vzn" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.110131 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dtdbs" podStartSLOduration=2.1101173 podStartE2EDuration="2.1101173s" podCreationTimestamp="2026-01-11 17:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:50:28.09706545 +0000 UTC m=+1202.275258156" watchObservedRunningTime="2026-01-11 17:50:28.1101173 +0000 UTC m=+1202.288310006" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.121641 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerStarted","Data":"321d247207531c2c1fe0901940f4550855249a0d284bd3a493d398bea93ffdfc"} Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.310742 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-glkr2"] Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.337740 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-774fb4c689-s8tf9"] Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.414421 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-vkf4l"] Jan 11 17:50:28 crc kubenswrapper[4837]: E0111 17:50:28.414734 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1cfeaf-6da5-40ed-b605-077e5c95900c" containerName="glance-db-sync" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.414746 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1cfeaf-6da5-40ed-b605-077e5c95900c" containerName="glance-db-sync" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.414892 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1cfeaf-6da5-40ed-b605-077e5c95900c" 
containerName="glance-db-sync" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.415627 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-686c95d4d9-ql2xj"] Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.416802 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.417640 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.436290 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-vkf4l"] Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453533 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-scripts\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453638 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrxn\" (UniqueName: \"kubernetes.io/projected/3cda1c41-2167-487e-951e-cd87cf9f6ab1-kube-api-access-fxrxn\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453660 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-config\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453760 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3cda1c41-2167-487e-951e-cd87cf9f6ab1-horizon-secret-key\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cda1c41-2167-487e-951e-cd87cf9f6ab1-logs\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453873 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453923 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453951 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.453977 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-config-data\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.454020 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsffv\" (UniqueName: \"kubernetes.io/projected/62eadde8-9001-457f-b1e1-86f88c38054f-kube-api-access-zsffv\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.457465 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686c95d4d9-ql2xj"] Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.518417 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3cda1c41-2167-487e-951e-cd87cf9f6ab1-horizon-secret-key\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557323 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3cda1c41-2167-487e-951e-cd87cf9f6ab1-logs\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557355 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557434 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-config-data\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557455 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsffv\" (UniqueName: 
\"kubernetes.io/projected/62eadde8-9001-457f-b1e1-86f88c38054f-kube-api-access-zsffv\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557516 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-scripts\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557532 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrxn\" (UniqueName: \"kubernetes.io/projected/3cda1c41-2167-487e-951e-cd87cf9f6ab1-kube-api-access-fxrxn\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.557547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-config\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.559638 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.561610 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-config\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.561869 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.562082 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cda1c41-2167-487e-951e-cd87cf9f6ab1-logs\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.562646 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.562899 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-config-data\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc 
kubenswrapper[4837]: I0111 17:50:28.563042 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.563292 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-scripts\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.567266 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3cda1c41-2167-487e-951e-cd87cf9f6ab1-horizon-secret-key\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.579990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrxn\" (UniqueName: \"kubernetes.io/projected/3cda1c41-2167-487e-951e-cd87cf9f6ab1-kube-api-access-fxrxn\") pod \"horizon-686c95d4d9-ql2xj\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.583987 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsffv\" (UniqueName: \"kubernetes.io/projected/62eadde8-9001-457f-b1e1-86f88c38054f-kube-api-access-zsffv\") pod \"dnsmasq-dns-8b5c85b87-vkf4l\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.703714 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.707879 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.759396 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdbwx\" (UniqueName: \"kubernetes.io/projected/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-kube-api-access-hdbwx\") pod \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.760181 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-swift-storage-0\") pod \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.760510 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-config\") pod \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.760572 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-sb\") pod \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.760596 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-svc\") pod \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\" (UID: 
\"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.760620 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-nb\") pod \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\" (UID: \"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510\") " Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.762619 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-kube-api-access-hdbwx" (OuterVolumeSpecName: "kube-api-access-hdbwx") pod "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" (UID: "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510"). InnerVolumeSpecName "kube-api-access-hdbwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.763891 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.787316 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" (UID: "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.862940 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdbwx\" (UniqueName: \"kubernetes.io/projected/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-kube-api-access-hdbwx\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.862990 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.872732 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-config" (OuterVolumeSpecName: "config") pod "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" (UID: "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.877103 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" (UID: "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.893459 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" (UID: "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.898531 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" (UID: "0c2f4993-7ed4-4e39-b9ef-1902b2fd3510"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.964758 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.964792 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.964803 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:28 crc kubenswrapper[4837]: I0111 17:50:28.964815 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.151561 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtdbs" event={"ID":"42cb0cea-5da0-4694-b28f-a24f296b6399","Type":"ContainerStarted","Data":"1289c566c9651fd8ce1f4b7af45c7b5db7bc43fe1f2044142a2eea0e3c3c33cb"} Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.158068 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv6l8" 
event={"ID":"1b22cc91-a2b0-4468-96bc-73cb4aab66bb","Type":"ContainerStarted","Data":"2bb5637ead90d8db0e9cc88d248e8a187c19bf6d89b936cc130f704c59b2d2eb"} Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.169015 4837 generic.go:334] "Generic (PLEG): container finished" podID="0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" containerID="00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2" exitCode=0 Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.169101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" event={"ID":"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510","Type":"ContainerDied","Data":"00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2"} Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.169182 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" event={"ID":"0c2f4993-7ed4-4e39-b9ef-1902b2fd3510","Type":"ContainerDied","Data":"617043aa32759256129f835163cefeb1e6cd7b0023d08359d2e84a9555f0578c"} Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.169204 4837 scope.go:117] "RemoveContainer" containerID="00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.169324 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-zmx2m" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.175342 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fv6l8" podStartSLOduration=3.175320046 podStartE2EDuration="3.175320046s" podCreationTimestamp="2026-01-11 17:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:50:29.17438446 +0000 UTC m=+1203.352577156" watchObservedRunningTime="2026-01-11 17:50:29.175320046 +0000 UTC m=+1203.353512762" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.181365 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k7p7p" event={"ID":"c5803c46-a48f-4120-9010-51375caff2a5","Type":"ContainerStarted","Data":"754de8f0e133e15347340ecdd59668a8763878335076729796b6ad0e10ca77bf"} Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.185625 4837 generic.go:334] "Generic (PLEG): container finished" podID="e7385aa0-566d-4f8d-b561-3295e49dd1fd" containerID="ab6b1d553a4b846c989fac49773a88a7129b96c775cea437e6e603e8b06b2e1e" exitCode=0 Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.185801 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" event={"ID":"e7385aa0-566d-4f8d-b561-3295e49dd1fd","Type":"ContainerDied","Data":"ab6b1d553a4b846c989fac49773a88a7129b96c775cea437e6e603e8b06b2e1e"} Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.185824 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" event={"ID":"e7385aa0-566d-4f8d-b561-3295e49dd1fd","Type":"ContainerStarted","Data":"02f1e46640d0f1ff4ba47cf2ebe5406b94d3e52b7a779df67d1706b804b030b1"} Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.230576 4837 scope.go:117] "RemoveContainer" 
containerID="00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2" Jan 11 17:50:29 crc kubenswrapper[4837]: E0111 17:50:29.231488 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2\": container with ID starting with 00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2 not found: ID does not exist" containerID="00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.231544 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2"} err="failed to get container status \"00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2\": rpc error: code = NotFound desc = could not find container \"00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2\": container with ID starting with 00940cc42626383b9a6bee92ded73382680f77d5ce1e0150615bad5c90190fb2 not found: ID does not exist" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.278938 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686c95d4d9-ql2xj"] Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.295906 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-zmx2m"] Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.304395 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-zmx2m"] Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.415220 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:50:29 crc kubenswrapper[4837]: E0111 17:50:29.415992 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" containerName="init" Jan 11 17:50:29 
crc kubenswrapper[4837]: I0111 17:50:29.416005 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" containerName="init" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.416394 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" containerName="init" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.426531 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.432889 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.433337 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.433559 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mt2ws" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.436076 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.468060 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-vkf4l"] Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.517939 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.521530 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.528708 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.539225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.620569 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.620623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.620650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-logs\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.620710 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-config-data\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " 
pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.620730 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-scripts\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.620779 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bppsr\" (UniqueName: \"kubernetes.io/projected/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-kube-api-access-bppsr\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.620796 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.676344 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722371 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-config-data\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-scripts\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722506 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " 
pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722536 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bppsr\" (UniqueName: \"kubernetes.io/projected/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-kube-api-access-bppsr\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722556 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722578 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722622 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdgb\" (UniqueName: \"kubernetes.io/projected/c8f257ff-6f30-439e-a45d-49ae8a65d059-kube-api-access-wjdgb\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722654 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " 
pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722692 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722712 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722735 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-logs\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.722770 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.723639 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") device mount path \"/mnt/openstack/pv12\"" 
pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.725394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.726944 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-logs\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.731291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.731745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-config-data\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.733514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-scripts\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.743343 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bppsr\" (UniqueName: \"kubernetes.io/projected/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-kube-api-access-bppsr\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.761868 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824154 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbk4d\" (UniqueName: \"kubernetes.io/projected/e7385aa0-566d-4f8d-b561-3295e49dd1fd-kube-api-access-zbk4d\") pod \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-config\") pod \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824294 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-nb\") pod \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824315 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-swift-storage-0\") pod \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-svc\") pod \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824429 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-sb\") pod \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\" (UID: \"e7385aa0-566d-4f8d-b561-3295e49dd1fd\") " Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824637 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdgb\" (UniqueName: \"kubernetes.io/projected/c8f257ff-6f30-439e-a45d-49ae8a65d059-kube-api-access-wjdgb\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824751 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.824894 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.825033 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" 
Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.825940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.825979 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.829094 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.829461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.829947 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.845523 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/e7385aa0-566d-4f8d-b561-3295e49dd1fd-kube-api-access-zbk4d" (OuterVolumeSpecName: "kube-api-access-zbk4d") pod "e7385aa0-566d-4f8d-b561-3295e49dd1fd" (UID: "e7385aa0-566d-4f8d-b561-3295e49dd1fd"). InnerVolumeSpecName "kube-api-access-zbk4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.853539 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdgb\" (UniqueName: \"kubernetes.io/projected/c8f257ff-6f30-439e-a45d-49ae8a65d059-kube-api-access-wjdgb\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.867731 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.875507 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7385aa0-566d-4f8d-b561-3295e49dd1fd" (UID: "e7385aa0-566d-4f8d-b561-3295e49dd1fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.876143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-config" (OuterVolumeSpecName: "config") pod "e7385aa0-566d-4f8d-b561-3295e49dd1fd" (UID: "e7385aa0-566d-4f8d-b561-3295e49dd1fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.884880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7385aa0-566d-4f8d-b561-3295e49dd1fd" (UID: "e7385aa0-566d-4f8d-b561-3295e49dd1fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.893321 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7385aa0-566d-4f8d-b561-3295e49dd1fd" (UID: "e7385aa0-566d-4f8d-b561-3295e49dd1fd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.894249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e7385aa0-566d-4f8d-b561-3295e49dd1fd" (UID: "e7385aa0-566d-4f8d-b561-3295e49dd1fd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.926392 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.926422 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.926436 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbk4d\" (UniqueName: \"kubernetes.io/projected/e7385aa0-566d-4f8d-b561-3295e49dd1fd-kube-api-access-zbk4d\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.926445 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.926453 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.926461 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7385aa0-566d-4f8d-b561-3295e49dd1fd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.958159 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:50:29 crc kubenswrapper[4837]: I0111 17:50:29.969667 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.200583 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" event={"ID":"62eadde8-9001-457f-b1e1-86f88c38054f","Type":"ContainerStarted","Data":"e7fcc437f2309e28763b248b036ba897d7572fbf437a3bb140e5ba586d9c8155"} Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.203164 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686c95d4d9-ql2xj" event={"ID":"3cda1c41-2167-487e-951e-cd87cf9f6ab1","Type":"ContainerStarted","Data":"ec076891666ff483736fc5aaf7973fa2b00fd3c0f009a714f8fcb822c19d46f7"} Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.206297 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" event={"ID":"e7385aa0-566d-4f8d-b561-3295e49dd1fd","Type":"ContainerDied","Data":"02f1e46640d0f1ff4ba47cf2ebe5406b94d3e52b7a779df67d1706b804b030b1"} Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.206357 4837 scope.go:117] "RemoveContainer" containerID="ab6b1d553a4b846c989fac49773a88a7129b96c775cea437e6e603e8b06b2e1e" Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.207039 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-glkr2" Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.267729 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-glkr2"] Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.272891 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-glkr2"] Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.390402 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2f4993-7ed4-4e39-b9ef-1902b2fd3510" path="/var/lib/kubelet/pods/0c2f4993-7ed4-4e39-b9ef-1902b2fd3510/volumes" Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.391041 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7385aa0-566d-4f8d-b561-3295e49dd1fd" path="/var/lib/kubelet/pods/e7385aa0-566d-4f8d-b561-3295e49dd1fd/volumes" Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.558321 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:50:30 crc kubenswrapper[4837]: W0111 17:50:30.592256 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f4fdee_d6ec_4ad2_8c8e_860791e29d29.slice/crio-ac96e43ac1c08e2a26d4bbba15a1da80f3caa1592785c93ed8deaecb4fdd60cf WatchSource:0}: Error finding container ac96e43ac1c08e2a26d4bbba15a1da80f3caa1592785c93ed8deaecb4fdd60cf: Status 404 returned error can't find the container with id ac96e43ac1c08e2a26d4bbba15a1da80f3caa1592785c93ed8deaecb4fdd60cf Jan 11 17:50:30 crc kubenswrapper[4837]: I0111 17:50:30.665186 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:30 crc kubenswrapper[4837]: W0111 17:50:30.688940 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8f257ff_6f30_439e_a45d_49ae8a65d059.slice/crio-dad4e7ea8976636ac059267949e0e71443b2f851cbc2e0da23cb9e51e930373d WatchSource:0}: Error finding container dad4e7ea8976636ac059267949e0e71443b2f851cbc2e0da23cb9e51e930373d: Status 404 returned error can't find the container with id dad4e7ea8976636ac059267949e0e71443b2f851cbc2e0da23cb9e51e930373d Jan 11 17:50:31 crc kubenswrapper[4837]: I0111 17:50:31.281429 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8f257ff-6f30-439e-a45d-49ae8a65d059","Type":"ContainerStarted","Data":"dad4e7ea8976636ac059267949e0e71443b2f851cbc2e0da23cb9e51e930373d"} Jan 11 17:50:31 crc kubenswrapper[4837]: I0111 17:50:31.287565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"81f4fdee-d6ec-4ad2-8c8e-860791e29d29","Type":"ContainerStarted","Data":"ac96e43ac1c08e2a26d4bbba15a1da80f3caa1592785c93ed8deaecb4fdd60cf"} Jan 11 17:50:34 crc kubenswrapper[4837]: I0111 17:50:34.341728 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8f257ff-6f30-439e-a45d-49ae8a65d059","Type":"ContainerStarted","Data":"884acb8d41d634b132ce34d258296280cf451e47107311023222d177f374b0ab"} Jan 11 17:50:35 crc kubenswrapper[4837]: I0111 17:50:35.352558 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" event={"ID":"62eadde8-9001-457f-b1e1-86f88c38054f","Type":"ContainerStarted","Data":"a4309113013c24ff187b372a2a4d740fc274a55d1c33b095f5dd5d925201ff1e"} Jan 11 17:50:35 crc kubenswrapper[4837]: I0111 17:50:35.354701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"81f4fdee-d6ec-4ad2-8c8e-860791e29d29","Type":"ContainerStarted","Data":"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d"} Jan 11 
17:50:37 crc kubenswrapper[4837]: I0111 17:50:37.372425 4837 generic.go:334] "Generic (PLEG): container finished" podID="62eadde8-9001-457f-b1e1-86f88c38054f" containerID="a4309113013c24ff187b372a2a4d740fc274a55d1c33b095f5dd5d925201ff1e" exitCode=0 Jan 11 17:50:37 crc kubenswrapper[4837]: I0111 17:50:37.372466 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" event={"ID":"62eadde8-9001-457f-b1e1-86f88c38054f","Type":"ContainerDied","Data":"a4309113013c24ff187b372a2a4d740fc274a55d1c33b095f5dd5d925201ff1e"} Jan 11 17:50:38 crc kubenswrapper[4837]: I0111 17:50:38.413808 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:50:38 crc kubenswrapper[4837]: I0111 17:50:38.513308 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.409862 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8f257ff-6f30-439e-a45d-49ae8a65d059","Type":"ContainerStarted","Data":"d4b92aca9f3ae089900bb2a81527cb4de2fef3bbf8fcdb52f6873862f033ea68"} Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.410404 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-log" containerID="cri-o://884acb8d41d634b132ce34d258296280cf451e47107311023222d177f374b0ab" gracePeriod=30 Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.411027 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-httpd" containerID="cri-o://d4b92aca9f3ae089900bb2a81527cb4de2fef3bbf8fcdb52f6873862f033ea68" gracePeriod=30 Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.437288 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.43727323 podStartE2EDuration="11.43727323s" podCreationTimestamp="2026-01-11 17:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:50:39.431564946 +0000 UTC m=+1213.609757652" watchObservedRunningTime="2026-01-11 17:50:39.43727323 +0000 UTC m=+1213.615465936" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.443656 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.443747 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.470806 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7755778889-fjtkp"] Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.509756 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d4fd56848-nmkm6"] Jan 11 17:50:39 crc kubenswrapper[4837]: E0111 17:50:39.510183 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7385aa0-566d-4f8d-b561-3295e49dd1fd" containerName="init" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.510201 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7385aa0-566d-4f8d-b561-3295e49dd1fd" containerName="init" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 
17:50:39.510358 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7385aa0-566d-4f8d-b561-3295e49dd1fd" containerName="init" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.511256 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.513563 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.532663 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d4fd56848-nmkm6"] Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.571513 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-686c95d4d9-ql2xj"] Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.598748 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f65cf99f6-zwzzs"] Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.606098 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.626706 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f65cf99f6-zwzzs"] Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.702186 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt67d\" (UniqueName: \"kubernetes.io/projected/af5aeb3b-e789-4f43-ac70-bb570e59027e-kube-api-access-vt67d\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.702532 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af5aeb3b-e789-4f43-ac70-bb570e59027e-logs\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.702580 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-secret-key\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.702815 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-combined-ca-bundle\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.702910 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-config-data\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.703015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-tls-certs\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.703139 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-scripts\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.804824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-tls-certs\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.805778 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad90513d-7bd8-4407-af16-8d041440673f-scripts\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.805807 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-combined-ca-bundle\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.805877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-scripts\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.805946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt67d\" (UniqueName: \"kubernetes.io/projected/af5aeb3b-e789-4f43-ac70-bb570e59027e-kube-api-access-vt67d\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.805996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af5aeb3b-e789-4f43-ac70-bb570e59027e-logs\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.806468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af5aeb3b-e789-4f43-ac70-bb570e59027e-logs\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.806735 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-scripts\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " 
pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.806854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-secret-key\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.806912 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad90513d-7bd8-4407-af16-8d041440673f-config-data\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.806933 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-combined-ca-bundle\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.806956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-horizon-secret-key\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.806971 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-horizon-tls-certs\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " 
pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.807025 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-config-data\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.807090 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad90513d-7bd8-4407-af16-8d041440673f-logs\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.807127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxbw\" (UniqueName: \"kubernetes.io/projected/ad90513d-7bd8-4407-af16-8d041440673f-kube-api-access-npxbw\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.808204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-config-data\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.810588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-combined-ca-bundle\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: 
I0111 17:50:39.810634 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-tls-certs\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.812496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-secret-key\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.829079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt67d\" (UniqueName: \"kubernetes.io/projected/af5aeb3b-e789-4f43-ac70-bb570e59027e-kube-api-access-vt67d\") pod \"horizon-5d4fd56848-nmkm6\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.830168 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.908343 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxbw\" (UniqueName: \"kubernetes.io/projected/ad90513d-7bd8-4407-af16-8d041440673f-kube-api-access-npxbw\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.908406 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad90513d-7bd8-4407-af16-8d041440673f-scripts\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.908429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-combined-ca-bundle\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.908502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad90513d-7bd8-4407-af16-8d041440673f-config-data\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.908524 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-horizon-secret-key\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc 
kubenswrapper[4837]: I0111 17:50:39.908547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-horizon-tls-certs\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.908568 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad90513d-7bd8-4407-af16-8d041440673f-logs\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.908962 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad90513d-7bd8-4407-af16-8d041440673f-logs\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.909401 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad90513d-7bd8-4407-af16-8d041440673f-scripts\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.910560 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad90513d-7bd8-4407-af16-8d041440673f-config-data\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.912548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-combined-ca-bundle\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.913208 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-horizon-secret-key\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.921255 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad90513d-7bd8-4407-af16-8d041440673f-horizon-tls-certs\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.926274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxbw\" (UniqueName: \"kubernetes.io/projected/ad90513d-7bd8-4407-af16-8d041440673f-kube-api-access-npxbw\") pod \"horizon-7f65cf99f6-zwzzs\" (UID: \"ad90513d-7bd8-4407-af16-8d041440673f\") " pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:39 crc kubenswrapper[4837]: I0111 17:50:39.977614 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:50:40 crc kubenswrapper[4837]: I0111 17:50:40.423523 4837 generic.go:334] "Generic (PLEG): container finished" podID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerID="d4b92aca9f3ae089900bb2a81527cb4de2fef3bbf8fcdb52f6873862f033ea68" exitCode=0 Jan 11 17:50:40 crc kubenswrapper[4837]: I0111 17:50:40.423560 4837 generic.go:334] "Generic (PLEG): container finished" podID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerID="884acb8d41d634b132ce34d258296280cf451e47107311023222d177f374b0ab" exitCode=143 Jan 11 17:50:40 crc kubenswrapper[4837]: I0111 17:50:40.423582 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8f257ff-6f30-439e-a45d-49ae8a65d059","Type":"ContainerDied","Data":"d4b92aca9f3ae089900bb2a81527cb4de2fef3bbf8fcdb52f6873862f033ea68"} Jan 11 17:50:40 crc kubenswrapper[4837]: I0111 17:50:40.423609 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8f257ff-6f30-439e-a45d-49ae8a65d059","Type":"ContainerDied","Data":"884acb8d41d634b132ce34d258296280cf451e47107311023222d177f374b0ab"} Jan 11 17:50:41 crc kubenswrapper[4837]: I0111 17:50:41.454391 4837 generic.go:334] "Generic (PLEG): container finished" podID="42cb0cea-5da0-4694-b28f-a24f296b6399" containerID="1289c566c9651fd8ce1f4b7af45c7b5db7bc43fe1f2044142a2eea0e3c3c33cb" exitCode=0 Jan 11 17:50:41 crc kubenswrapper[4837]: I0111 17:50:41.454792 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtdbs" event={"ID":"42cb0cea-5da0-4694-b28f-a24f296b6399","Type":"ContainerDied","Data":"1289c566c9651fd8ce1f4b7af45c7b5db7bc43fe1f2044142a2eea0e3c3c33cb"} Jan 11 17:50:45 crc kubenswrapper[4837]: E0111 17:50:45.365237 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 11 17:50:45 crc kubenswrapper[4837]: E0111 17:50:45.365947 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n594hbchfbh6bh588h68h67h5ddh76h78h9bh9ch687h697h64fh9dhdh687h559h55hf9h549hf8h549h597hd8h667h548h595h56fh94hfcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdppn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Vo
lumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7755778889-fjtkp_openstack(77a72881-5957-4b75-b490-0417214a39b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:50:45 crc kubenswrapper[4837]: E0111 17:50:45.377135 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7755778889-fjtkp" podUID="77a72881-5957-4b75-b490-0417214a39b0" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.471516 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.493392 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dtdbs" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.493746 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dtdbs" event={"ID":"42cb0cea-5da0-4694-b28f-a24f296b6399","Type":"ContainerDied","Data":"a19f25df1eb806552c2981f7e193e0b388af7bc05792353f25cf48021a2ad99f"} Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.494098 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19f25df1eb806552c2981f7e193e0b388af7bc05792353f25cf48021a2ad99f" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.548758 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-fernet-keys\") pod \"42cb0cea-5da0-4694-b28f-a24f296b6399\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.548829 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-config-data\") pod \"42cb0cea-5da0-4694-b28f-a24f296b6399\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.548867 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-scripts\") pod \"42cb0cea-5da0-4694-b28f-a24f296b6399\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.548920 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-combined-ca-bundle\") pod \"42cb0cea-5da0-4694-b28f-a24f296b6399\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " Jan 11 17:50:45 crc 
kubenswrapper[4837]: I0111 17:50:45.549071 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-credential-keys\") pod \"42cb0cea-5da0-4694-b28f-a24f296b6399\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.549155 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s8qr\" (UniqueName: \"kubernetes.io/projected/42cb0cea-5da0-4694-b28f-a24f296b6399-kube-api-access-9s8qr\") pod \"42cb0cea-5da0-4694-b28f-a24f296b6399\" (UID: \"42cb0cea-5da0-4694-b28f-a24f296b6399\") " Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.555761 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-scripts" (OuterVolumeSpecName: "scripts") pod "42cb0cea-5da0-4694-b28f-a24f296b6399" (UID: "42cb0cea-5da0-4694-b28f-a24f296b6399"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.556016 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "42cb0cea-5da0-4694-b28f-a24f296b6399" (UID: "42cb0cea-5da0-4694-b28f-a24f296b6399"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.559886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "42cb0cea-5da0-4694-b28f-a24f296b6399" (UID: "42cb0cea-5da0-4694-b28f-a24f296b6399"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.571443 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cb0cea-5da0-4694-b28f-a24f296b6399-kube-api-access-9s8qr" (OuterVolumeSpecName: "kube-api-access-9s8qr") pod "42cb0cea-5da0-4694-b28f-a24f296b6399" (UID: "42cb0cea-5da0-4694-b28f-a24f296b6399"). InnerVolumeSpecName "kube-api-access-9s8qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.596792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-config-data" (OuterVolumeSpecName: "config-data") pod "42cb0cea-5da0-4694-b28f-a24f296b6399" (UID: "42cb0cea-5da0-4694-b28f-a24f296b6399"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.617663 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42cb0cea-5da0-4694-b28f-a24f296b6399" (UID: "42cb0cea-5da0-4694-b28f-a24f296b6399"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.651381 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s8qr\" (UniqueName: \"kubernetes.io/projected/42cb0cea-5da0-4694-b28f-a24f296b6399-kube-api-access-9s8qr\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.651421 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.651436 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.651448 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.651460 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:45 crc kubenswrapper[4837]: I0111 17:50:45.651471 4837 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42cb0cea-5da0-4694-b28f-a24f296b6399-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.557352 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dtdbs"] Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.564473 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dtdbs"] Jan 11 17:50:46 crc 
kubenswrapper[4837]: I0111 17:50:46.708390 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7pd2m"] Jan 11 17:50:46 crc kubenswrapper[4837]: E0111 17:50:46.708940 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cb0cea-5da0-4694-b28f-a24f296b6399" containerName="keystone-bootstrap" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.708961 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cb0cea-5da0-4694-b28f-a24f296b6399" containerName="keystone-bootstrap" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.709226 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cb0cea-5da0-4694-b28f-a24f296b6399" containerName="keystone-bootstrap" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.710030 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.729893 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.730152 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6ct2s" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.730448 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.730611 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.730784 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.742895 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7pd2m"] Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.806720 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-combined-ca-bundle\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.806791 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-credential-keys\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.807380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-scripts\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.807544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-fernet-keys\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.807618 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-config-data\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.807651 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bncrq\" (UniqueName: \"kubernetes.io/projected/ca96b8a6-63fc-49fc-8898-6794c54e1676-kube-api-access-bncrq\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.909105 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-config-data\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.909149 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bncrq\" (UniqueName: \"kubernetes.io/projected/ca96b8a6-63fc-49fc-8898-6794c54e1676-kube-api-access-bncrq\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.909187 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-combined-ca-bundle\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.909214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-credential-keys\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.910496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-scripts\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.910553 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-fernet-keys\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.920307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-credential-keys\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.920352 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-scripts\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.920448 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-fernet-keys\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.920783 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-combined-ca-bundle\") pod \"keystone-bootstrap-7pd2m\" (UID: 
\"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.920884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-config-data\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:46 crc kubenswrapper[4837]: I0111 17:50:46.930002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bncrq\" (UniqueName: \"kubernetes.io/projected/ca96b8a6-63fc-49fc-8898-6794c54e1676-kube-api-access-bncrq\") pod \"keystone-bootstrap-7pd2m\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:47 crc kubenswrapper[4837]: I0111 17:50:47.053250 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:50:48 crc kubenswrapper[4837]: I0111 17:50:48.375560 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cb0cea-5da0-4694-b28f-a24f296b6399" path="/var/lib/kubelet/pods/42cb0cea-5da0-4694-b28f-a24f296b6399/volumes" Jan 11 17:50:49 crc kubenswrapper[4837]: E0111 17:50:49.776341 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 11 17:50:49 crc kubenswrapper[4837]: E0111 17:50:49.776883 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6bh6fh65ch567h6bh569h654h58ch575h75hd8hc4h5cfh8bh67bh9fhbbh56bh5cdh57bh8bh648h5b7h54ch599h95h665h594h5c5h686h668h688q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djtfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-774fb4c689-s8tf9_openstack(de0a2bd6-2144-4c5c-91ab-a5fde49c9808): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:50:49 crc kubenswrapper[4837]: E0111 
17:50:49.779191 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-774fb4c689-s8tf9" podUID="de0a2bd6-2144-4c5c-91ab-a5fde49c9808" Jan 11 17:50:49 crc kubenswrapper[4837]: E0111 17:50:49.810002 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 11 17:50:49 crc kubenswrapper[4837]: E0111 17:50:49.810262 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n557h6hc9hfh666h87h84hb4hb4h9dh5dch5ffhbfh545h5hbbh65fh58fh5cchc9h5f5hd6h679h65h5c5h75h5ffh9h6dhfch5ffh684q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fxrxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-686c95d4d9-ql2xj_openstack(3cda1c41-2167-487e-951e-cd87cf9f6ab1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:50:49 crc kubenswrapper[4837]: E0111 17:50:49.812882 
4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-686c95d4d9-ql2xj" podUID="3cda1c41-2167-487e-951e-cd87cf9f6ab1" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.486623 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.496393 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.608862 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8f257ff-6f30-439e-a45d-49ae8a65d059","Type":"ContainerDied","Data":"dad4e7ea8976636ac059267949e0e71443b2f851cbc2e0da23cb9e51e930373d"} Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.608897 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.608942 4837 scope.go:117] "RemoveContainer" containerID="d4b92aca9f3ae089900bb2a81527cb4de2fef3bbf8fcdb52f6873862f033ea68" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.610666 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7755778889-fjtkp" event={"ID":"77a72881-5957-4b75-b490-0417214a39b0","Type":"ContainerDied","Data":"017a024d95cc6246ecd3c548904f78c3c70916b5beb921eaead52672bb8e7dcf"} Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.610748 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7755778889-fjtkp" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.614663 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjdgb\" (UniqueName: \"kubernetes.io/projected/c8f257ff-6f30-439e-a45d-49ae8a65d059-kube-api-access-wjdgb\") pod \"c8f257ff-6f30-439e-a45d-49ae8a65d059\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.614753 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-scripts\") pod \"77a72881-5957-4b75-b490-0417214a39b0\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.614785 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77a72881-5957-4b75-b490-0417214a39b0-horizon-secret-key\") pod \"77a72881-5957-4b75-b490-0417214a39b0\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.614859 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-config-data\") pod \"c8f257ff-6f30-439e-a45d-49ae8a65d059\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.614920 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-combined-ca-bundle\") pod \"c8f257ff-6f30-439e-a45d-49ae8a65d059\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.614950 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/77a72881-5957-4b75-b490-0417214a39b0-logs\") pod \"77a72881-5957-4b75-b490-0417214a39b0\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.614979 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-logs\") pod \"c8f257ff-6f30-439e-a45d-49ae8a65d059\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdppn\" (UniqueName: \"kubernetes.io/projected/77a72881-5957-4b75-b490-0417214a39b0-kube-api-access-bdppn\") pod \"77a72881-5957-4b75-b490-0417214a39b0\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615046 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"c8f257ff-6f30-439e-a45d-49ae8a65d059\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615094 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-httpd-run\") pod \"c8f257ff-6f30-439e-a45d-49ae8a65d059\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615122 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-scripts\") pod \"c8f257ff-6f30-439e-a45d-49ae8a65d059\" (UID: \"c8f257ff-6f30-439e-a45d-49ae8a65d059\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615152 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-config-data\") pod \"77a72881-5957-4b75-b490-0417214a39b0\" (UID: \"77a72881-5957-4b75-b490-0417214a39b0\") " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615433 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-logs" (OuterVolumeSpecName: "logs") pod "c8f257ff-6f30-439e-a45d-49ae8a65d059" (UID: "c8f257ff-6f30-439e-a45d-49ae8a65d059"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615694 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a72881-5957-4b75-b490-0417214a39b0-logs" (OuterVolumeSpecName: "logs") pod "77a72881-5957-4b75-b490-0417214a39b0" (UID: "77a72881-5957-4b75-b490-0417214a39b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.615737 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c8f257ff-6f30-439e-a45d-49ae8a65d059" (UID: "c8f257ff-6f30-439e-a45d-49ae8a65d059"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.616269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-scripts" (OuterVolumeSpecName: "scripts") pod "77a72881-5957-4b75-b490-0417214a39b0" (UID: "77a72881-5957-4b75-b490-0417214a39b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.616300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-config-data" (OuterVolumeSpecName: "config-data") pod "77a72881-5957-4b75-b490-0417214a39b0" (UID: "77a72881-5957-4b75-b490-0417214a39b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.620615 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "c8f257ff-6f30-439e-a45d-49ae8a65d059" (UID: "c8f257ff-6f30-439e-a45d-49ae8a65d059"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.621919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-scripts" (OuterVolumeSpecName: "scripts") pod "c8f257ff-6f30-439e-a45d-49ae8a65d059" (UID: "c8f257ff-6f30-439e-a45d-49ae8a65d059"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.622391 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a72881-5957-4b75-b490-0417214a39b0-kube-api-access-bdppn" (OuterVolumeSpecName: "kube-api-access-bdppn") pod "77a72881-5957-4b75-b490-0417214a39b0" (UID: "77a72881-5957-4b75-b490-0417214a39b0"). InnerVolumeSpecName "kube-api-access-bdppn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.622429 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a72881-5957-4b75-b490-0417214a39b0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "77a72881-5957-4b75-b490-0417214a39b0" (UID: "77a72881-5957-4b75-b490-0417214a39b0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.622909 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f257ff-6f30-439e-a45d-49ae8a65d059-kube-api-access-wjdgb" (OuterVolumeSpecName: "kube-api-access-wjdgb") pod "c8f257ff-6f30-439e-a45d-49ae8a65d059" (UID: "c8f257ff-6f30-439e-a45d-49ae8a65d059"). InnerVolumeSpecName "kube-api-access-wjdgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.655319 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8f257ff-6f30-439e-a45d-49ae8a65d059" (UID: "c8f257ff-6f30-439e-a45d-49ae8a65d059"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.672455 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-config-data" (OuterVolumeSpecName: "config-data") pod "c8f257ff-6f30-439e-a45d-49ae8a65d059" (UID: "c8f257ff-6f30-439e-a45d-49ae8a65d059"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717439 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdppn\" (UniqueName: \"kubernetes.io/projected/77a72881-5957-4b75-b490-0417214a39b0-kube-api-access-bdppn\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717491 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717501 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717512 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717521 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717529 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjdgb\" (UniqueName: \"kubernetes.io/projected/c8f257ff-6f30-439e-a45d-49ae8a65d059-kube-api-access-wjdgb\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717537 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77a72881-5957-4b75-b490-0417214a39b0-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717545 4837 reconciler_common.go:293] "Volume 
detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77a72881-5957-4b75-b490-0417214a39b0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717556 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717564 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f257ff-6f30-439e-a45d-49ae8a65d059-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717572 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77a72881-5957-4b75-b490-0417214a39b0-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.717580 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8f257ff-6f30-439e-a45d-49ae8a65d059-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.735560 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.818846 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:57 crc kubenswrapper[4837]: E0111 17:50:57.925782 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 11 17:50:57 crc kubenswrapper[4837]: E0111 
17:50:57.926014 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbsr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-6mdwl_openstack(5930b460-1c65-4c06-a3bc-f6d6f0518110): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:50:57 crc kubenswrapper[4837]: E0111 17:50:57.928379 4837 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-6mdwl" podUID="5930b460-1c65-4c06-a3bc-f6d6f0518110" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.951389 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.969326 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.976494 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:57 crc kubenswrapper[4837]: E0111 17:50:57.977147 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-httpd" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.977170 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-httpd" Jan 11 17:50:57 crc kubenswrapper[4837]: E0111 17:50:57.977184 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-log" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.977194 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-log" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.977419 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-httpd" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.977451 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" containerName="glance-log" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.978642 
4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.985209 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.992441 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 11 17:50:57 crc kubenswrapper[4837]: I0111 17:50:57.994506 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.002487 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7755778889-fjtkp"] Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.031470 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7755778889-fjtkp"] Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.128878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.128941 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.128972 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grhl\" (UniqueName: 
\"kubernetes.io/projected/979d7c48-3688-478f-bb46-d78b535b84dc-kube-api-access-7grhl\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.129042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.129074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.129383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.129481 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.129543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-logs\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.230738 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.230793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7grhl\" (UniqueName: \"kubernetes.io/projected/979d7c48-3688-478f-bb46-d78b535b84dc-kube-api-access-7grhl\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.230848 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.230880 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.230956 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.230993 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.231016 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-logs\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.231058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.234154 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-logs\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.234325 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"979d7c48-3688-478f-bb46-d78b535b84dc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.236125 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.236233 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.238098 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.242016 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.257260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " 
pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.262584 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grhl\" (UniqueName: \"kubernetes.io/projected/979d7c48-3688-478f-bb46-d78b535b84dc-kube-api-access-7grhl\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.265525 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.311650 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.373567 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a72881-5957-4b75-b490-0417214a39b0" path="/var/lib/kubelet/pods/77a72881-5957-4b75-b490-0417214a39b0/volumes" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.374225 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8f257ff-6f30-439e-a45d-49ae8a65d059" path="/var/lib/kubelet/pods/c8f257ff-6f30-439e-a45d-49ae8a65d059/volumes" Jan 11 17:50:58 crc kubenswrapper[4837]: E0111 17:50:58.454177 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Jan 11 17:50:58 crc kubenswrapper[4837]: E0111 17:50:58.454332 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8fh5bh649h594h58hbchc5h5b5h55fh5c6h67bh5d8h679hc4h687h575h5fdh54bh5ddh9hfbh5b7hc5h664hcdhbh576h64bh5cch66fh677h5dcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k964n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5ec0beaf-de63-407f-8d18-46738023ab11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.542081 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.553366 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.621280 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686c95d4d9-ql2xj" event={"ID":"3cda1c41-2167-487e-951e-cd87cf9f6ab1","Type":"ContainerDied","Data":"ec076891666ff483736fc5aaf7973fa2b00fd3c0f009a714f8fcb822c19d46f7"} Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.621295 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686c95d4d9-ql2xj" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.623080 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-774fb4c689-s8tf9" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.623089 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-774fb4c689-s8tf9" event={"ID":"de0a2bd6-2144-4c5c-91ab-a5fde49c9808","Type":"ContainerDied","Data":"73351a7db683ab2a22f755d6abd69634d726c38cf0296db896a17b6b15b98f0f"} Jan 11 17:50:58 crc kubenswrapper[4837]: E0111 17:50:58.624372 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-6mdwl" podUID="5930b460-1c65-4c06-a3bc-f6d6f0518110" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-config-data\") pod \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637442 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-logs\") pod \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-horizon-secret-key\") pod \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637562 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-scripts\") pod \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637644 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxrxn\" (UniqueName: \"kubernetes.io/projected/3cda1c41-2167-487e-951e-cd87cf9f6ab1-kube-api-access-fxrxn\") pod \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637714 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-scripts\") pod \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637739 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djtfv\" (UniqueName: \"kubernetes.io/projected/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-kube-api-access-djtfv\") pod \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637914 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-config-data\") pod \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\" (UID: \"de0a2bd6-2144-4c5c-91ab-a5fde49c9808\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637956 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cda1c41-2167-487e-951e-cd87cf9f6ab1-logs\") pod \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.637979 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3cda1c41-2167-487e-951e-cd87cf9f6ab1-horizon-secret-key\") pod \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\" (UID: \"3cda1c41-2167-487e-951e-cd87cf9f6ab1\") " Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.639518 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cda1c41-2167-487e-951e-cd87cf9f6ab1-logs" (OuterVolumeSpecName: "logs") pod "3cda1c41-2167-487e-951e-cd87cf9f6ab1" (UID: "3cda1c41-2167-487e-951e-cd87cf9f6ab1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.639785 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-scripts" (OuterVolumeSpecName: "scripts") pod "3cda1c41-2167-487e-951e-cd87cf9f6ab1" (UID: "3cda1c41-2167-487e-951e-cd87cf9f6ab1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.642302 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-config-data" (OuterVolumeSpecName: "config-data") pod "de0a2bd6-2144-4c5c-91ab-a5fde49c9808" (UID: "de0a2bd6-2144-4c5c-91ab-a5fde49c9808"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.643840 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-config-data" (OuterVolumeSpecName: "config-data") pod "3cda1c41-2167-487e-951e-cd87cf9f6ab1" (UID: "3cda1c41-2167-487e-951e-cd87cf9f6ab1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.643946 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-scripts" (OuterVolumeSpecName: "scripts") pod "de0a2bd6-2144-4c5c-91ab-a5fde49c9808" (UID: "de0a2bd6-2144-4c5c-91ab-a5fde49c9808"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.644176 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-logs" (OuterVolumeSpecName: "logs") pod "de0a2bd6-2144-4c5c-91ab-a5fde49c9808" (UID: "de0a2bd6-2144-4c5c-91ab-a5fde49c9808"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.644362 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "de0a2bd6-2144-4c5c-91ab-a5fde49c9808" (UID: "de0a2bd6-2144-4c5c-91ab-a5fde49c9808"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.645097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cda1c41-2167-487e-951e-cd87cf9f6ab1-kube-api-access-fxrxn" (OuterVolumeSpecName: "kube-api-access-fxrxn") pod "3cda1c41-2167-487e-951e-cd87cf9f6ab1" (UID: "3cda1c41-2167-487e-951e-cd87cf9f6ab1"). InnerVolumeSpecName "kube-api-access-fxrxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.645519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-kube-api-access-djtfv" (OuterVolumeSpecName: "kube-api-access-djtfv") pod "de0a2bd6-2144-4c5c-91ab-a5fde49c9808" (UID: "de0a2bd6-2144-4c5c-91ab-a5fde49c9808"). InnerVolumeSpecName "kube-api-access-djtfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.645919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cda1c41-2167-487e-951e-cd87cf9f6ab1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3cda1c41-2167-487e-951e-cd87cf9f6ab1" (UID: "3cda1c41-2167-487e-951e-cd87cf9f6ab1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.739919 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.739964 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxrxn\" (UniqueName: \"kubernetes.io/projected/3cda1c41-2167-487e-951e-cd87cf9f6ab1-kube-api-access-fxrxn\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.739985 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djtfv\" (UniqueName: \"kubernetes.io/projected/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-kube-api-access-djtfv\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.739995 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.740003 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.740014 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3cda1c41-2167-487e-951e-cd87cf9f6ab1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.740021 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cda1c41-2167-487e-951e-cd87cf9f6ab1-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.740029 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cda1c41-2167-487e-951e-cd87cf9f6ab1-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.740038 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.740047 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de0a2bd6-2144-4c5c-91ab-a5fde49c9808-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 11 17:50:58 crc kubenswrapper[4837]: I0111 17:50:58.999493 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-686c95d4d9-ql2xj"] Jan 11 17:50:59 crc kubenswrapper[4837]: I0111 17:50:59.007071 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-686c95d4d9-ql2xj"] Jan 11 
17:50:59 crc kubenswrapper[4837]: I0111 17:50:59.022585 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-774fb4c689-s8tf9"] Jan 11 17:50:59 crc kubenswrapper[4837]: I0111 17:50:59.030120 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-774fb4c689-s8tf9"] Jan 11 17:51:00 crc kubenswrapper[4837]: I0111 17:51:00.377413 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cda1c41-2167-487e-951e-cd87cf9f6ab1" path="/var/lib/kubelet/pods/3cda1c41-2167-487e-951e-cd87cf9f6ab1/volumes" Jan 11 17:51:00 crc kubenswrapper[4837]: I0111 17:51:00.378619 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0a2bd6-2144-4c5c-91ab-a5fde49c9808" path="/var/lib/kubelet/pods/de0a2bd6-2144-4c5c-91ab-a5fde49c9808/volumes" Jan 11 17:51:00 crc kubenswrapper[4837]: E0111 17:51:00.894589 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 11 17:51:00 crc kubenswrapper[4837]: E0111 17:51:00.894934 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7st6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-l58f7_openstack(ecc4b7a3-4585-48ee-9cf3-caa58c9f5895): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 17:51:00 crc kubenswrapper[4837]: E0111 17:51:00.896260 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-l58f7" podUID="ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" Jan 11 17:51:00 crc kubenswrapper[4837]: I0111 17:51:00.912964 4837 scope.go:117] "RemoveContainer" containerID="884acb8d41d634b132ce34d258296280cf451e47107311023222d177f374b0ab" Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.431609 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d4fd56848-nmkm6"] Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.527501 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7pd2m"] Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.539433 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f65cf99f6-zwzzs"] Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.610910 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.655220 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k7p7p" event={"ID":"c5803c46-a48f-4120-9010-51375caff2a5","Type":"ContainerStarted","Data":"48dffaa36d493d8933fc0c6d2c32e0493364340b235a0c14855cc42fb8164cb3"} Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.657551 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" event={"ID":"62eadde8-9001-457f-b1e1-86f88c38054f","Type":"ContainerStarted","Data":"01b38fa03cd288d00d9cae5cf20c33763603aabff635cc6f247d9c7495659eed"} Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.657730 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.663689 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-log" containerID="cri-o://1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d" gracePeriod=30 Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.663904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"81f4fdee-d6ec-4ad2-8c8e-860791e29d29","Type":"ContainerStarted","Data":"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10"} Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.663957 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-httpd" containerID="cri-o://839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10" gracePeriod=30 Jan 11 17:51:01 crc kubenswrapper[4837]: E0111 17:51:01.664604 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-l58f7" podUID="ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.677430 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-k7p7p" 
podStartSLOduration=5.279794692 podStartE2EDuration="35.677415771s" podCreationTimestamp="2026-01-11 17:50:26 +0000 UTC" firstStartedPulling="2026-01-11 17:50:28.050618752 +0000 UTC m=+1202.228811458" lastFinishedPulling="2026-01-11 17:50:58.448239821 +0000 UTC m=+1232.626432537" observedRunningTime="2026-01-11 17:51:01.671540294 +0000 UTC m=+1235.849733000" watchObservedRunningTime="2026-01-11 17:51:01.677415771 +0000 UTC m=+1235.855608477" Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.694360 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=33.694339416 podStartE2EDuration="33.694339416s" podCreationTimestamp="2026-01-11 17:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:01.688966011 +0000 UTC m=+1235.867158717" watchObservedRunningTime="2026-01-11 17:51:01.694339416 +0000 UTC m=+1235.872532122" Jan 11 17:51:01 crc kubenswrapper[4837]: I0111 17:51:01.733900 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" podStartSLOduration=33.733881528 podStartE2EDuration="33.733881528s" podCreationTimestamp="2026-01-11 17:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:01.730577179 +0000 UTC m=+1235.908769885" watchObservedRunningTime="2026-01-11 17:51:01.733881528 +0000 UTC m=+1235.912074234" Jan 11 17:51:01 crc kubenswrapper[4837]: W0111 17:51:01.753167 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca96b8a6_63fc_49fc_8898_6794c54e1676.slice/crio-72012f39c7fe583f2232e8629f2ef5ba52635c7ec6b9e778bcbfd7e9e255b3b6 WatchSource:0}: Error finding container 
72012f39c7fe583f2232e8629f2ef5ba52635c7ec6b9e778bcbfd7e9e255b3b6: Status 404 returned error can't find the container with id 72012f39c7fe583f2232e8629f2ef5ba52635c7ec6b9e778bcbfd7e9e255b3b6 Jan 11 17:51:01 crc kubenswrapper[4837]: W0111 17:51:01.767988 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad90513d_7bd8_4407_af16_8d041440673f.slice/crio-665ff5b39579a6fbee058b204149942234ce2f5ba3c25f5be5f8661662034d49 WatchSource:0}: Error finding container 665ff5b39579a6fbee058b204149942234ce2f5ba3c25f5be5f8661662034d49: Status 404 returned error can't find the container with id 665ff5b39579a6fbee058b204149942234ce2f5ba3c25f5be5f8661662034d49 Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.202859 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.319790 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-config-data\") pod \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.319906 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-logs\") pod \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.319940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.319963 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-httpd-run\") pod \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.320042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-combined-ca-bundle\") pod \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.320065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bppsr\" (UniqueName: \"kubernetes.io/projected/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-kube-api-access-bppsr\") pod \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.320123 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-scripts\") pod \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\" (UID: \"81f4fdee-d6ec-4ad2-8c8e-860791e29d29\") " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.321696 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-logs" (OuterVolumeSpecName: "logs") pod "81f4fdee-d6ec-4ad2-8c8e-860791e29d29" (UID: "81f4fdee-d6ec-4ad2-8c8e-860791e29d29"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.321752 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "81f4fdee-d6ec-4ad2-8c8e-860791e29d29" (UID: "81f4fdee-d6ec-4ad2-8c8e-860791e29d29"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.327694 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-kube-api-access-bppsr" (OuterVolumeSpecName: "kube-api-access-bppsr") pod "81f4fdee-d6ec-4ad2-8c8e-860791e29d29" (UID: "81f4fdee-d6ec-4ad2-8c8e-860791e29d29"). InnerVolumeSpecName "kube-api-access-bppsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.329744 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "81f4fdee-d6ec-4ad2-8c8e-860791e29d29" (UID: "81f4fdee-d6ec-4ad2-8c8e-860791e29d29"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.329928 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-scripts" (OuterVolumeSpecName: "scripts") pod "81f4fdee-d6ec-4ad2-8c8e-860791e29d29" (UID: "81f4fdee-d6ec-4ad2-8c8e-860791e29d29"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.361792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81f4fdee-d6ec-4ad2-8c8e-860791e29d29" (UID: "81f4fdee-d6ec-4ad2-8c8e-860791e29d29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.397814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-config-data" (OuterVolumeSpecName: "config-data") pod "81f4fdee-d6ec-4ad2-8c8e-860791e29d29" (UID: "81f4fdee-d6ec-4ad2-8c8e-860791e29d29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.422186 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.422219 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.422245 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.422254 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 
17:51:02.422263 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.422274 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bppsr\" (UniqueName: \"kubernetes.io/projected/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-kube-api-access-bppsr\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.422282 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81f4fdee-d6ec-4ad2-8c8e-860791e29d29-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.444324 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.524302 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.675531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7pd2m" event={"ID":"ca96b8a6-63fc-49fc-8898-6794c54e1676","Type":"ContainerStarted","Data":"aac301345f13d97aaec56ecccb3cd8c148a3f0d7fdf0d6970717bd45d50ca86d"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.675792 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7pd2m" event={"ID":"ca96b8a6-63fc-49fc-8898-6794c54e1676","Type":"ContainerStarted","Data":"72012f39c7fe583f2232e8629f2ef5ba52635c7ec6b9e778bcbfd7e9e255b3b6"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.690711 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7f65cf99f6-zwzzs" event={"ID":"ad90513d-7bd8-4407-af16-8d041440673f","Type":"ContainerStarted","Data":"98865f1ad3b29c4e04600174953cfc9f5e8072c9565697179596455069582ace"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.690762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f65cf99f6-zwzzs" event={"ID":"ad90513d-7bd8-4407-af16-8d041440673f","Type":"ContainerStarted","Data":"665ff5b39579a6fbee058b204149942234ce2f5ba3c25f5be5f8661662034d49"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.696448 4837 generic.go:334] "Generic (PLEG): container finished" podID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerID="839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10" exitCode=143 Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.696491 4837 generic.go:334] "Generic (PLEG): container finished" podID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerID="1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d" exitCode=143 Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.696560 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"81f4fdee-d6ec-4ad2-8c8e-860791e29d29","Type":"ContainerDied","Data":"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.696595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"81f4fdee-d6ec-4ad2-8c8e-860791e29d29","Type":"ContainerDied","Data":"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.696610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"81f4fdee-d6ec-4ad2-8c8e-860791e29d29","Type":"ContainerDied","Data":"ac96e43ac1c08e2a26d4bbba15a1da80f3caa1592785c93ed8deaecb4fdd60cf"} Jan 11 17:51:02 crc kubenswrapper[4837]: 
I0111 17:51:02.696628 4837 scope.go:117] "RemoveContainer" containerID="839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.696856 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.701461 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7pd2m" podStartSLOduration=16.701437475 podStartE2EDuration="16.701437475s" podCreationTimestamp="2026-01-11 17:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:02.693412789 +0000 UTC m=+1236.871605495" watchObservedRunningTime="2026-01-11 17:51:02.701437475 +0000 UTC m=+1236.879630171" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.706588 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d4fd56848-nmkm6" event={"ID":"af5aeb3b-e789-4f43-ac70-bb570e59027e","Type":"ContainerStarted","Data":"6d7775187bc9d8933cd74c6fb7ea42d457f38072bf52d7ec4be27690a1b39be4"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.706636 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d4fd56848-nmkm6" event={"ID":"af5aeb3b-e789-4f43-ac70-bb570e59027e","Type":"ContainerStarted","Data":"2cb13701105734300d330441ec671cbbb96ce16214153e592ff508e3acc0b60d"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.709204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerStarted","Data":"6365fbb22458d0aa6474e7c381cd0e77f5f0e1ffa3601c5afba83111f5f7eff4"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.716266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"979d7c48-3688-478f-bb46-d78b535b84dc","Type":"ContainerStarted","Data":"97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.716496 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"979d7c48-3688-478f-bb46-d78b535b84dc","Type":"ContainerStarted","Data":"ca1e84dc3cb6f40a2763a349082811b2575c3fb7758df69a625605bc60e628e9"} Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.778501 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.786320 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.797783 4837 scope.go:117] "RemoveContainer" containerID="1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.803454 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:51:02 crc kubenswrapper[4837]: E0111 17:51:02.804078 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-log" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.804165 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-log" Jan 11 17:51:02 crc kubenswrapper[4837]: E0111 17:51:02.804270 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-httpd" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.804337 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-httpd" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.804537 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-httpd" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.804597 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" containerName="glance-log" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.805945 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.809760 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.810075 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.813720 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.883443 4837 scope.go:117] "RemoveContainer" containerID="839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10" Jan 11 17:51:02 crc kubenswrapper[4837]: E0111 17:51:02.883912 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10\": container with ID starting with 839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10 not found: ID does not exist" containerID="839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.883945 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10"} err="failed to get container status 
\"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10\": rpc error: code = NotFound desc = could not find container \"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10\": container with ID starting with 839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10 not found: ID does not exist" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.883965 4837 scope.go:117] "RemoveContainer" containerID="1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d" Jan 11 17:51:02 crc kubenswrapper[4837]: E0111 17:51:02.884358 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d\": container with ID starting with 1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d not found: ID does not exist" containerID="1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.884412 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d"} err="failed to get container status \"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d\": rpc error: code = NotFound desc = could not find container \"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d\": container with ID starting with 1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d not found: ID does not exist" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.884446 4837 scope.go:117] "RemoveContainer" containerID="839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.884820 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10"} err="failed to get 
container status \"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10\": rpc error: code = NotFound desc = could not find container \"839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10\": container with ID starting with 839b0def4ada7fcbae8706d56fd333b7ca38ae628c564640c8e50d09467f3b10 not found: ID does not exist" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.884866 4837 scope.go:117] "RemoveContainer" containerID="1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.885143 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d"} err="failed to get container status \"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d\": rpc error: code = NotFound desc = could not find container \"1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d\": container with ID starting with 1ee545563084fcd1474bc5c5f724cd899fa6f68bc74e4db8fc23fd271aecdb4d not found: ID does not exist" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.941271 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhhfr\" (UniqueName: \"kubernetes.io/projected/9669a553-3ac2-4189-86f2-08e5b972e66f-kube-api-access-mhhfr\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.941326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-scripts\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.941382 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.941531 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.941584 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.941713 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-logs\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.941749 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:02 crc kubenswrapper[4837]: I0111 17:51:02.942088 
4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-config-data\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.043896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-logs\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.043953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.044042 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-config-data\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.044084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhhfr\" (UniqueName: \"kubernetes.io/projected/9669a553-3ac2-4189-86f2-08e5b972e66f-kube-api-access-mhhfr\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.044119 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-scripts\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.044196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.044238 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.044264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.045583 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.045730 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-logs\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.045983 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.049605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-scripts\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.052283 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.052804 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-config-data\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.058411 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.066430 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhhfr\" (UniqueName: \"kubernetes.io/projected/9669a553-3ac2-4189-86f2-08e5b972e66f-kube-api-access-mhhfr\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.119822 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.251390 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.729869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f65cf99f6-zwzzs" event={"ID":"ad90513d-7bd8-4407-af16-8d041440673f","Type":"ContainerStarted","Data":"0c88823152b87847baed058ede82884c331724525aca81dd10ea18b3839059f5"} Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.737528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d4fd56848-nmkm6" event={"ID":"af5aeb3b-e789-4f43-ac70-bb570e59027e","Type":"ContainerStarted","Data":"31448bf7a925ddeb3fb3d1d51e19aa71d7fe4393dd1c0b9b7fa3db44c4fe276a"} Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.742626 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"979d7c48-3688-478f-bb46-d78b535b84dc","Type":"ContainerStarted","Data":"3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe"} Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.767628 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7f65cf99f6-zwzzs" podStartSLOduration=24.276698386 podStartE2EDuration="24.767608621s" podCreationTimestamp="2026-01-11 17:50:39 +0000 UTC" firstStartedPulling="2026-01-11 17:51:01.772427143 +0000 UTC m=+1235.950619849" lastFinishedPulling="2026-01-11 17:51:02.263337378 +0000 UTC m=+1236.441530084" observedRunningTime="2026-01-11 17:51:03.757918721 +0000 UTC m=+1237.936111427" watchObservedRunningTime="2026-01-11 17:51:03.767608621 +0000 UTC m=+1237.945801327" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.786502 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.786477739 podStartE2EDuration="6.786477739s" podCreationTimestamp="2026-01-11 17:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:03.775068422 +0000 UTC m=+1237.953261128" watchObservedRunningTime="2026-01-11 17:51:03.786477739 +0000 UTC m=+1237.964670435" Jan 11 17:51:03 crc kubenswrapper[4837]: I0111 17:51:03.809588 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d4fd56848-nmkm6" podStartSLOduration=24.323372179 podStartE2EDuration="24.809569588s" podCreationTimestamp="2026-01-11 17:50:39 +0000 UTC" firstStartedPulling="2026-01-11 17:51:01.773215344 +0000 UTC m=+1235.951408080" lastFinishedPulling="2026-01-11 17:51:02.259412783 +0000 UTC m=+1236.437605489" observedRunningTime="2026-01-11 17:51:03.800121305 +0000 UTC m=+1237.978314011" watchObservedRunningTime="2026-01-11 17:51:03.809569588 +0000 UTC m=+1237.987762294" Jan 11 17:51:03 crc 
kubenswrapper[4837]: I0111 17:51:03.843220 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:51:03 crc kubenswrapper[4837]: W0111 17:51:03.846629 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9669a553_3ac2_4189_86f2_08e5b972e66f.slice/crio-6e9ae8e0d53b14efbab3700a8054fcc5237fed420f5b695b4b8af5293168a4f8 WatchSource:0}: Error finding container 6e9ae8e0d53b14efbab3700a8054fcc5237fed420f5b695b4b8af5293168a4f8: Status 404 returned error can't find the container with id 6e9ae8e0d53b14efbab3700a8054fcc5237fed420f5b695b4b8af5293168a4f8 Jan 11 17:51:04 crc kubenswrapper[4837]: I0111 17:51:04.376863 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f4fdee-d6ec-4ad2-8c8e-860791e29d29" path="/var/lib/kubelet/pods/81f4fdee-d6ec-4ad2-8c8e-860791e29d29/volumes" Jan 11 17:51:04 crc kubenswrapper[4837]: I0111 17:51:04.751767 4837 generic.go:334] "Generic (PLEG): container finished" podID="c5803c46-a48f-4120-9010-51375caff2a5" containerID="48dffaa36d493d8933fc0c6d2c32e0493364340b235a0c14855cc42fb8164cb3" exitCode=0 Jan 11 17:51:04 crc kubenswrapper[4837]: I0111 17:51:04.751837 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k7p7p" event={"ID":"c5803c46-a48f-4120-9010-51375caff2a5","Type":"ContainerDied","Data":"48dffaa36d493d8933fc0c6d2c32e0493364340b235a0c14855cc42fb8164cb3"} Jan 11 17:51:04 crc kubenswrapper[4837]: I0111 17:51:04.753412 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9669a553-3ac2-4189-86f2-08e5b972e66f","Type":"ContainerStarted","Data":"6e9ae8e0d53b14efbab3700a8054fcc5237fed420f5b695b4b8af5293168a4f8"} Jan 11 17:51:05 crc kubenswrapper[4837]: I0111 17:51:05.764156 4837 generic.go:334] "Generic (PLEG): container finished" podID="ca96b8a6-63fc-49fc-8898-6794c54e1676" 
containerID="aac301345f13d97aaec56ecccb3cd8c148a3f0d7fdf0d6970717bd45d50ca86d" exitCode=0 Jan 11 17:51:05 crc kubenswrapper[4837]: I0111 17:51:05.764215 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7pd2m" event={"ID":"ca96b8a6-63fc-49fc-8898-6794c54e1676","Type":"ContainerDied","Data":"aac301345f13d97aaec56ecccb3cd8c148a3f0d7fdf0d6970717bd45d50ca86d"} Jan 11 17:51:05 crc kubenswrapper[4837]: I0111 17:51:05.767354 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9669a553-3ac2-4189-86f2-08e5b972e66f","Type":"ContainerStarted","Data":"f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244"} Jan 11 17:51:06 crc kubenswrapper[4837]: I0111 17:51:06.775306 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b22cc91-a2b0-4468-96bc-73cb4aab66bb" containerID="2bb5637ead90d8db0e9cc88d248e8a187c19bf6d89b936cc130f704c59b2d2eb" exitCode=0 Jan 11 17:51:06 crc kubenswrapper[4837]: I0111 17:51:06.775847 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv6l8" event={"ID":"1b22cc91-a2b0-4468-96bc-73cb4aab66bb","Type":"ContainerDied","Data":"2bb5637ead90d8db0e9cc88d248e8a187c19bf6d89b936cc130f704c59b2d2eb"} Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.259710 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.268446 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-k7p7p" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.334875 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-config-data\") pod \"c5803c46-a48f-4120-9010-51375caff2a5\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.335188 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-scripts\") pod \"ca96b8a6-63fc-49fc-8898-6794c54e1676\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.335362 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-combined-ca-bundle\") pod \"c5803c46-a48f-4120-9010-51375caff2a5\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.335450 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-config-data\") pod \"ca96b8a6-63fc-49fc-8898-6794c54e1676\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.335544 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-combined-ca-bundle\") pod \"ca96b8a6-63fc-49fc-8898-6794c54e1676\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.335712 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c5803c46-a48f-4120-9010-51375caff2a5-logs\") pod \"c5803c46-a48f-4120-9010-51375caff2a5\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.335818 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-fernet-keys\") pod \"ca96b8a6-63fc-49fc-8898-6794c54e1676\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.335928 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-credential-keys\") pod \"ca96b8a6-63fc-49fc-8898-6794c54e1676\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.336025 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-scripts\") pod \"c5803c46-a48f-4120-9010-51375caff2a5\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.336146 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bncrq\" (UniqueName: \"kubernetes.io/projected/ca96b8a6-63fc-49fc-8898-6794c54e1676-kube-api-access-bncrq\") pod \"ca96b8a6-63fc-49fc-8898-6794c54e1676\" (UID: \"ca96b8a6-63fc-49fc-8898-6794c54e1676\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.336272 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7czq\" (UniqueName: \"kubernetes.io/projected/c5803c46-a48f-4120-9010-51375caff2a5-kube-api-access-l7czq\") pod \"c5803c46-a48f-4120-9010-51375caff2a5\" (UID: \"c5803c46-a48f-4120-9010-51375caff2a5\") " Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.340066 
4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5803c46-a48f-4120-9010-51375caff2a5-logs" (OuterVolumeSpecName: "logs") pod "c5803c46-a48f-4120-9010-51375caff2a5" (UID: "c5803c46-a48f-4120-9010-51375caff2a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.342612 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ca96b8a6-63fc-49fc-8898-6794c54e1676" (UID: "ca96b8a6-63fc-49fc-8898-6794c54e1676"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.343369 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ca96b8a6-63fc-49fc-8898-6794c54e1676" (UID: "ca96b8a6-63fc-49fc-8898-6794c54e1676"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.343724 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-scripts" (OuterVolumeSpecName: "scripts") pod "c5803c46-a48f-4120-9010-51375caff2a5" (UID: "c5803c46-a48f-4120-9010-51375caff2a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.348982 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5803c46-a48f-4120-9010-51375caff2a5-kube-api-access-l7czq" (OuterVolumeSpecName: "kube-api-access-l7czq") pod "c5803c46-a48f-4120-9010-51375caff2a5" (UID: "c5803c46-a48f-4120-9010-51375caff2a5"). 
InnerVolumeSpecName "kube-api-access-l7czq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.348975 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-scripts" (OuterVolumeSpecName: "scripts") pod "ca96b8a6-63fc-49fc-8898-6794c54e1676" (UID: "ca96b8a6-63fc-49fc-8898-6794c54e1676"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.357025 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca96b8a6-63fc-49fc-8898-6794c54e1676-kube-api-access-bncrq" (OuterVolumeSpecName: "kube-api-access-bncrq") pod "ca96b8a6-63fc-49fc-8898-6794c54e1676" (UID: "ca96b8a6-63fc-49fc-8898-6794c54e1676"). InnerVolumeSpecName "kube-api-access-bncrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.363877 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-config-data" (OuterVolumeSpecName: "config-data") pod "c5803c46-a48f-4120-9010-51375caff2a5" (UID: "c5803c46-a48f-4120-9010-51375caff2a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.375069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-config-data" (OuterVolumeSpecName: "config-data") pod "ca96b8a6-63fc-49fc-8898-6794c54e1676" (UID: "ca96b8a6-63fc-49fc-8898-6794c54e1676"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.391396 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5803c46-a48f-4120-9010-51375caff2a5" (UID: "c5803c46-a48f-4120-9010-51375caff2a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.396814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca96b8a6-63fc-49fc-8898-6794c54e1676" (UID: "ca96b8a6-63fc-49fc-8898-6794c54e1676"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439415 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5803c46-a48f-4120-9010-51375caff2a5-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439455 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439469 4837 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439483 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc 
kubenswrapper[4837]: I0111 17:51:07.439496 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bncrq\" (UniqueName: \"kubernetes.io/projected/ca96b8a6-63fc-49fc-8898-6794c54e1676-kube-api-access-bncrq\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439508 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7czq\" (UniqueName: \"kubernetes.io/projected/c5803c46-a48f-4120-9010-51375caff2a5-kube-api-access-l7czq\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439519 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439530 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439540 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5803c46-a48f-4120-9010-51375caff2a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439551 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.439562 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca96b8a6-63fc-49fc-8898-6794c54e1676-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.787089 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-7pd2m" event={"ID":"ca96b8a6-63fc-49fc-8898-6794c54e1676","Type":"ContainerDied","Data":"72012f39c7fe583f2232e8629f2ef5ba52635c7ec6b9e778bcbfd7e9e255b3b6"} Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.787455 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72012f39c7fe583f2232e8629f2ef5ba52635c7ec6b9e778bcbfd7e9e255b3b6" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.787101 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7pd2m" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.790056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k7p7p" event={"ID":"c5803c46-a48f-4120-9010-51375caff2a5","Type":"ContainerDied","Data":"754de8f0e133e15347340ecdd59668a8763878335076729796b6ad0e10ca77bf"} Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.790084 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="754de8f0e133e15347340ecdd59668a8763878335076729796b6ad0e10ca77bf" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.790104 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-k7p7p" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.881645 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-79986b9d84-7gl5k"] Jan 11 17:51:07 crc kubenswrapper[4837]: E0111 17:51:07.882069 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca96b8a6-63fc-49fc-8898-6794c54e1676" containerName="keystone-bootstrap" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.882082 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca96b8a6-63fc-49fc-8898-6794c54e1676" containerName="keystone-bootstrap" Jan 11 17:51:07 crc kubenswrapper[4837]: E0111 17:51:07.882100 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5803c46-a48f-4120-9010-51375caff2a5" containerName="placement-db-sync" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.882107 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5803c46-a48f-4120-9010-51375caff2a5" containerName="placement-db-sync" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.882293 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5803c46-a48f-4120-9010-51375caff2a5" containerName="placement-db-sync" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.882313 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca96b8a6-63fc-49fc-8898-6794c54e1676" containerName="keystone-bootstrap" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.882936 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.888188 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.888502 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.889132 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.889265 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.889432 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.889584 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6ct2s" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.914388 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79986b9d84-7gl5k"] Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.947800 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-public-tls-certs\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.947908 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-fernet-keys\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " 
pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.947931 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-credential-keys\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.947973 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-combined-ca-bundle\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.948010 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-config-data\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.948041 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-scripts\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.948085 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-internal-tls-certs\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " 
pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:07 crc kubenswrapper[4837]: I0111 17:51:07.948293 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bphwt\" (UniqueName: \"kubernetes.io/projected/42661ef7-7007-4cff-b945-85690a07399f-kube-api-access-bphwt\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.050506 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-fernet-keys\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.050556 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-credential-keys\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.050594 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-combined-ca-bundle\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.050623 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-config-data\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc 
kubenswrapper[4837]: I0111 17:51:08.050646 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-scripts\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.050710 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-internal-tls-certs\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.050753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bphwt\" (UniqueName: \"kubernetes.io/projected/42661ef7-7007-4cff-b945-85690a07399f-kube-api-access-bphwt\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.050780 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-public-tls-certs\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.055644 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-config-data\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.057760 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-scripts\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.058392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-internal-tls-certs\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.058915 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-combined-ca-bundle\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.058978 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-credential-keys\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.059894 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-fernet-keys\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.062121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42661ef7-7007-4cff-b945-85690a07399f-public-tls-certs\") pod 
\"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.070143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bphwt\" (UniqueName: \"kubernetes.io/projected/42661ef7-7007-4cff-b945-85690a07399f-kube-api-access-bphwt\") pod \"keystone-79986b9d84-7gl5k\" (UID: \"42661ef7-7007-4cff-b945-85690a07399f\") " pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.211414 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.312801 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.312849 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.357368 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.384549 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.432792 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-86bdff5ffb-hdnql"] Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.434703 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.439158 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.439496 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.439639 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ltsqk" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.439810 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.440078 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.441174 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86bdff5ffb-hdnql"] Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.568771 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc52c\" (UniqueName: \"kubernetes.io/projected/1a6ff225-8495-4008-9719-c85bcb7fa65b-kube-api-access-lc52c\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.569868 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-config-data\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.570359 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-scripts\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.570468 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-internal-tls-certs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.571055 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-public-tls-certs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.571401 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6ff225-8495-4008-9719-c85bcb7fa65b-logs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.571934 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-combined-ca-bundle\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.674128 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-public-tls-certs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.674201 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6ff225-8495-4008-9719-c85bcb7fa65b-logs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.674264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-combined-ca-bundle\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.674315 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc52c\" (UniqueName: \"kubernetes.io/projected/1a6ff225-8495-4008-9719-c85bcb7fa65b-kube-api-access-lc52c\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.674351 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-config-data\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.674384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-scripts\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.674422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-internal-tls-certs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.675100 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6ff225-8495-4008-9719-c85bcb7fa65b-logs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.679339 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-public-tls-certs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.681506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-config-data\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.681967 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-scripts\") pod \"placement-86bdff5ffb-hdnql\" (UID: 
\"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.682262 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-internal-tls-certs\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.691637 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6ff225-8495-4008-9719-c85bcb7fa65b-combined-ca-bundle\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.698851 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc52c\" (UniqueName: \"kubernetes.io/projected/1a6ff225-8495-4008-9719-c85bcb7fa65b-kube-api-access-lc52c\") pod \"placement-86bdff5ffb-hdnql\" (UID: \"1a6ff225-8495-4008-9719-c85bcb7fa65b\") " pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.759216 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.767056 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.798812 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.798842 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.834223 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-xqvrn"] Jan 11 17:51:08 crc kubenswrapper[4837]: I0111 17:51:08.834445 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" podUID="e5f0d593-ab67-4967-9397-517c45742d39" containerName="dnsmasq-dns" containerID="cri-o://77eee50bec11a2ded47dc627f6957ec511c0d29c38f2d45ecf1bb4c34fd9dfab" gracePeriod=10 Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.444530 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.444885 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.444939 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.445707 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"018458ba654491d89959a574a332d71c982ec5d0e53864759d97887f8cd68688"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.445760 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://018458ba654491d89959a574a332d71c982ec5d0e53864759d97887f8cd68688" gracePeriod=600 Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.809350 4837 generic.go:334] "Generic (PLEG): container finished" podID="e5f0d593-ab67-4967-9397-517c45742d39" containerID="77eee50bec11a2ded47dc627f6957ec511c0d29c38f2d45ecf1bb4c34fd9dfab" exitCode=0 Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.809415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" event={"ID":"e5f0d593-ab67-4967-9397-517c45742d39","Type":"ContainerDied","Data":"77eee50bec11a2ded47dc627f6957ec511c0d29c38f2d45ecf1bb4c34fd9dfab"} Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.811654 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="018458ba654491d89959a574a332d71c982ec5d0e53864759d97887f8cd68688" exitCode=0 Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.812877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" 
event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"018458ba654491d89959a574a332d71c982ec5d0e53864759d97887f8cd68688"} Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.812963 4837 scope.go:117] "RemoveContainer" containerID="fefbb2dded3dd3108c6af21a68e03b8d09000012756ad3a506890b3d9bced335" Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.830720 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.830829 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.979890 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:51:09 crc kubenswrapper[4837]: I0111 17:51:09.981334 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:51:10 crc kubenswrapper[4837]: I0111 17:51:10.702395 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:10 crc kubenswrapper[4837]: I0111 17:51:10.819261 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.043742 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.070135 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.127734 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-config\") pod \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.128098 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlm7h\" (UniqueName: \"kubernetes.io/projected/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-kube-api-access-tlm7h\") pod \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.128138 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-combined-ca-bundle\") pod \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\" (UID: \"1b22cc91-a2b0-4468-96bc-73cb4aab66bb\") " Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.176876 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-config" (OuterVolumeSpecName: "config") pod "1b22cc91-a2b0-4468-96bc-73cb4aab66bb" (UID: "1b22cc91-a2b0-4468-96bc-73cb4aab66bb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.188868 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-kube-api-access-tlm7h" (OuterVolumeSpecName: "kube-api-access-tlm7h") pod "1b22cc91-a2b0-4468-96bc-73cb4aab66bb" (UID: "1b22cc91-a2b0-4468-96bc-73cb4aab66bb"). InnerVolumeSpecName "kube-api-access-tlm7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.228618 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b22cc91-a2b0-4468-96bc-73cb4aab66bb" (UID: "1b22cc91-a2b0-4468-96bc-73cb4aab66bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.237906 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlm7h\" (UniqueName: \"kubernetes.io/projected/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-kube-api-access-tlm7h\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.237932 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.237941 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b22cc91-a2b0-4468-96bc-73cb4aab66bb-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:11 crc kubenswrapper[4837]: W0111 17:51:11.439439 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42661ef7_7007_4cff_b945_85690a07399f.slice/crio-c0db652ef5b4d0521f1baa8cc63db0ec74566eb8d545b453a88d3f4e944d6bd1 WatchSource:0}: Error finding container c0db652ef5b4d0521f1baa8cc63db0ec74566eb8d545b453a88d3f4e944d6bd1: Status 404 returned error can't find the container with id c0db652ef5b4d0521f1baa8cc63db0ec74566eb8d545b453a88d3f4e944d6bd1 Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.441001 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79986b9d84-7gl5k"] Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.548065 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86bdff5ffb-hdnql"] Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.827855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv6l8" event={"ID":"1b22cc91-a2b0-4468-96bc-73cb4aab66bb","Type":"ContainerDied","Data":"42ccf9115a3447e853ebf96b73d2dc31114b47cf09877edca8126916216d2b39"} Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.827887 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42ccf9115a3447e853ebf96b73d2dc31114b47cf09877edca8126916216d2b39" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.827888 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fv6l8" Jan 11 17:51:11 crc kubenswrapper[4837]: I0111 17:51:11.828817 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79986b9d84-7gl5k" event={"ID":"42661ef7-7007-4cff-b945-85690a07399f","Type":"ContainerStarted","Data":"c0db652ef5b4d0521f1baa8cc63db0ec74566eb8d545b453a88d3f4e944d6bd1"} Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.186505 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" podUID="e5f0d593-ab67-4967-9397-517c45742d39" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.304254 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-7p2rt"] Jan 11 17:51:12 crc kubenswrapper[4837]: E0111 17:51:12.305568 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b22cc91-a2b0-4468-96bc-73cb4aab66bb" containerName="neutron-db-sync" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.305597 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b22cc91-a2b0-4468-96bc-73cb4aab66bb" containerName="neutron-db-sync" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.305982 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b22cc91-a2b0-4468-96bc-73cb4aab66bb" containerName="neutron-db-sync" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.315943 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.336894 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-7p2rt"] Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.361042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-config\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.361187 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xb2t\" (UniqueName: \"kubernetes.io/projected/ce0fbcd3-57e1-437f-b004-7f609443f898-kube-api-access-5xb2t\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.361253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.361279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.361435 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.361494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.438845 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d654f854d-9wdwv"] Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.440136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.442977 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.443129 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.443227 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mp677" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.443456 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.457158 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d654f854d-9wdwv"] Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-httpd-config\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466782 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2ds\" (UniqueName: \"kubernetes.io/projected/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-kube-api-access-lv2ds\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466841 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-config\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466887 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.466998 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-config\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.467048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-combined-ca-bundle\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.467077 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-ovndb-tls-certs\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.467100 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xb2t\" (UniqueName: 
\"kubernetes.io/projected/ce0fbcd3-57e1-437f-b004-7f609443f898-kube-api-access-5xb2t\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.468645 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.470449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.471060 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-config\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.471911 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.473185 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.497397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xb2t\" (UniqueName: \"kubernetes.io/projected/ce0fbcd3-57e1-437f-b004-7f609443f898-kube-api-access-5xb2t\") pod \"dnsmasq-dns-84b966f6c9-7p2rt\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.568518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-ovndb-tls-certs\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.568589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-httpd-config\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.568617 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2ds\" (UniqueName: \"kubernetes.io/projected/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-kube-api-access-lv2ds\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.568652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-config\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " 
pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.568743 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-combined-ca-bundle\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.575654 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-combined-ca-bundle\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.575854 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-ovndb-tls-certs\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.580422 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-httpd-config\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.581298 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-config\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.601332 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lv2ds\" (UniqueName: \"kubernetes.io/projected/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-kube-api-access-lv2ds\") pod \"neutron-d654f854d-9wdwv\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.635896 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.761111 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.843592 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86bdff5ffb-hdnql" event={"ID":"1a6ff225-8495-4008-9719-c85bcb7fa65b","Type":"ContainerStarted","Data":"0d8a8c2f7feaf33b3384a11f11cb4ba98d5a66961806646e0755756a6e54f88b"} Jan 11 17:51:12 crc kubenswrapper[4837]: I0111 17:51:12.866793 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:12.999711 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-config\") pod \"e5f0d593-ab67-4967-9397-517c45742d39\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.000684 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-sb\") pod \"e5f0d593-ab67-4967-9397-517c45742d39\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.000738 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vlnn\" (UniqueName: \"kubernetes.io/projected/e5f0d593-ab67-4967-9397-517c45742d39-kube-api-access-7vlnn\") pod \"e5f0d593-ab67-4967-9397-517c45742d39\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.000755 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-svc\") pod \"e5f0d593-ab67-4967-9397-517c45742d39\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.000846 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-nb\") pod \"e5f0d593-ab67-4967-9397-517c45742d39\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.001119 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-swift-storage-0\") pod \"e5f0d593-ab67-4967-9397-517c45742d39\" (UID: \"e5f0d593-ab67-4967-9397-517c45742d39\") " Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.023537 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f0d593-ab67-4967-9397-517c45742d39-kube-api-access-7vlnn" (OuterVolumeSpecName: "kube-api-access-7vlnn") pod "e5f0d593-ab67-4967-9397-517c45742d39" (UID: "e5f0d593-ab67-4967-9397-517c45742d39"). InnerVolumeSpecName "kube-api-access-7vlnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.087859 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-config" (OuterVolumeSpecName: "config") pod "e5f0d593-ab67-4967-9397-517c45742d39" (UID: "e5f0d593-ab67-4967-9397-517c45742d39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.090153 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5f0d593-ab67-4967-9397-517c45742d39" (UID: "e5f0d593-ab67-4967-9397-517c45742d39"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.108265 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.108287 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.108296 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vlnn\" (UniqueName: \"kubernetes.io/projected/e5f0d593-ab67-4967-9397-517c45742d39-kube-api-access-7vlnn\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.110464 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5f0d593-ab67-4967-9397-517c45742d39" (UID: "e5f0d593-ab67-4967-9397-517c45742d39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.117778 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5f0d593-ab67-4967-9397-517c45742d39" (UID: "e5f0d593-ab67-4967-9397-517c45742d39"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.119861 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5f0d593-ab67-4967-9397-517c45742d39" (UID: "e5f0d593-ab67-4967-9397-517c45742d39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.209984 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.210015 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.210054 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f0d593-ab67-4967-9397-517c45742d39-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.281170 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-7p2rt"] Jan 11 17:51:13 crc kubenswrapper[4837]: W0111 17:51:13.281635 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce0fbcd3_57e1_437f_b004_7f609443f898.slice/crio-de6c9e95f5ec9e4da5ceab00589f9279539847f46f12ed16290f478ebed9e673 WatchSource:0}: Error finding container de6c9e95f5ec9e4da5ceab00589f9279539847f46f12ed16290f478ebed9e673: Status 404 returned error can't find the container with id 
de6c9e95f5ec9e4da5ceab00589f9279539847f46f12ed16290f478ebed9e673 Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.317551 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d654f854d-9wdwv"] Jan 11 17:51:13 crc kubenswrapper[4837]: W0111 17:51:13.320849 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bce9623_f279_4dfb_9cc2_e5a2de56ca9d.slice/crio-dcae9db6e43cfdf9074ff3194356c0877b6a47345ee3c3d5ed3e3d2cf4f6fab9 WatchSource:0}: Error finding container dcae9db6e43cfdf9074ff3194356c0877b6a47345ee3c3d5ed3e3d2cf4f6fab9: Status 404 returned error can't find the container with id dcae9db6e43cfdf9074ff3194356c0877b6a47345ee3c3d5ed3e3d2cf4f6fab9 Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.853591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9669a553-3ac2-4189-86f2-08e5b972e66f","Type":"ContainerStarted","Data":"b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.858624 4837 generic.go:334] "Generic (PLEG): container finished" podID="ce0fbcd3-57e1-437f-b004-7f609443f898" containerID="b35778a0da294966f950416862c1a2d9bc895a95c830ed367d3ba78319fdda82" exitCode=0 Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.858686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" event={"ID":"ce0fbcd3-57e1-437f-b004-7f609443f898","Type":"ContainerDied","Data":"b35778a0da294966f950416862c1a2d9bc895a95c830ed367d3ba78319fdda82"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.858738 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" event={"ID":"ce0fbcd3-57e1-437f-b004-7f609443f898","Type":"ContainerStarted","Data":"de6c9e95f5ec9e4da5ceab00589f9279539847f46f12ed16290f478ebed9e673"} Jan 11 17:51:13 crc 
kubenswrapper[4837]: I0111 17:51:13.861024 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86bdff5ffb-hdnql" event={"ID":"1a6ff225-8495-4008-9719-c85bcb7fa65b","Type":"ContainerStarted","Data":"8a560cf52de1ec3ae5b32e6f752c971faaf93ff3e43700f77385eabe8f4d2544"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.861059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86bdff5ffb-hdnql" event={"ID":"1a6ff225-8495-4008-9719-c85bcb7fa65b","Type":"ContainerStarted","Data":"8e2e6937b32d0e1f84a3f4d72622f4516962cc1356ba1c879c580afbab8bc732"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.861492 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.861519 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.870986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerStarted","Data":"a2be37e1652ad8ecbd0c0c8eb6badcd102f2579adebd7c475651f8d2fad86993"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.888116 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.888264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-xqvrn" event={"ID":"e5f0d593-ab67-4967-9397-517c45742d39","Type":"ContainerDied","Data":"903a073c30fe7183293db7550b9f7189457f4885c8da44caa5fae22c605188d5"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.888301 4837 scope.go:117] "RemoveContainer" containerID="77eee50bec11a2ded47dc627f6957ec511c0d29c38f2d45ecf1bb4c34fd9dfab" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.903994 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"e9cdcfc59acae9ad70b4c250fd104a15cd4c9020810ce5e6aa418583f1bc934e"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.909203 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-86bdff5ffb-hdnql" podStartSLOduration=5.909182633 podStartE2EDuration="5.909182633s" podCreationTimestamp="2026-01-11 17:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:13.906329186 +0000 UTC m=+1248.084521892" watchObservedRunningTime="2026-01-11 17:51:13.909182633 +0000 UTC m=+1248.087375339" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.912037 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.912024239 podStartE2EDuration="11.912024239s" podCreationTimestamp="2026-01-11 17:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:13.881654703 +0000 UTC m=+1248.059847409" watchObservedRunningTime="2026-01-11 
17:51:13.912024239 +0000 UTC m=+1248.090216945" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.942133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d654f854d-9wdwv" event={"ID":"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d","Type":"ContainerStarted","Data":"06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.942198 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d654f854d-9wdwv" event={"ID":"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d","Type":"ContainerStarted","Data":"bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.942212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d654f854d-9wdwv" event={"ID":"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d","Type":"ContainerStarted","Data":"dcae9db6e43cfdf9074ff3194356c0877b6a47345ee3c3d5ed3e3d2cf4f6fab9"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.942231 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.981253 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79986b9d84-7gl5k" event={"ID":"42661ef7-7007-4cff-b945-85690a07399f","Type":"ContainerStarted","Data":"89e3cc6ef31a3a37a8d9acc026995d58722578346af40b2d1ed7b24d29359872"} Jan 11 17:51:13 crc kubenswrapper[4837]: I0111 17:51:13.982140 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.015069 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d654f854d-9wdwv" podStartSLOduration=2.015049066 podStartE2EDuration="2.015049066s" podCreationTimestamp="2026-01-11 17:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:13.988623306 +0000 UTC m=+1248.166816012" watchObservedRunningTime="2026-01-11 17:51:14.015049066 +0000 UTC m=+1248.193241772" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.035322 4837 scope.go:117] "RemoveContainer" containerID="d241bf124a3b2ee1f8906722e2dac544932bb327fcf087bf4e834c39c85bc804" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.053635 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-79986b9d84-7gl5k" podStartSLOduration=7.053615543 podStartE2EDuration="7.053615543s" podCreationTimestamp="2026-01-11 17:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:14.022660461 +0000 UTC m=+1248.200853167" watchObservedRunningTime="2026-01-11 17:51:14.053615543 +0000 UTC m=+1248.231808249" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.073245 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-xqvrn"] Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.112313 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-xqvrn"] Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.402902 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f0d593-ab67-4967-9397-517c45742d39" path="/var/lib/kubelet/pods/e5f0d593-ab67-4967-9397-517c45742d39/volumes" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.865511 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75f9d98d89-rxjpb"] Jan 11 17:51:14 crc kubenswrapper[4837]: E0111 17:51:14.869640 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f0d593-ab67-4967-9397-517c45742d39" containerName="dnsmasq-dns" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.869860 4837 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e5f0d593-ab67-4967-9397-517c45742d39" containerName="dnsmasq-dns" Jan 11 17:51:14 crc kubenswrapper[4837]: E0111 17:51:14.869941 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f0d593-ab67-4967-9397-517c45742d39" containerName="init" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.870014 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f0d593-ab67-4967-9397-517c45742d39" containerName="init" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.872618 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f0d593-ab67-4967-9397-517c45742d39" containerName="dnsmasq-dns" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.874025 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.882890 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.884185 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.889033 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75f9d98d89-rxjpb"] Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.949950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-internal-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.950416 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-ovndb-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.950521 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-config\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.950727 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-public-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.950772 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-httpd-config\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.950813 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-combined-ca-bundle\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.950858 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-797sd\" (UniqueName: 
\"kubernetes.io/projected/556f75eb-e607-44ee-bbde-cc94844a98bd-kube-api-access-797sd\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.990129 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" event={"ID":"ce0fbcd3-57e1-437f-b004-7f609443f898","Type":"ContainerStarted","Data":"71ae436473ecaeccb052dd25701507bd1adfa22c1b2c4afb6588a78d857f1798"} Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.991154 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:14 crc kubenswrapper[4837]: I0111 17:51:14.995807 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mdwl" event={"ID":"5930b460-1c65-4c06-a3bc-f6d6f0518110","Type":"ContainerStarted","Data":"ec89d103ce72e03c31f614723ff515f9f50db651282eb16e23134970bbd6d516"} Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.019131 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" podStartSLOduration=3.019112864 podStartE2EDuration="3.019112864s" podCreationTimestamp="2026-01-11 17:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:15.006810064 +0000 UTC m=+1249.185002770" watchObservedRunningTime="2026-01-11 17:51:15.019112864 +0000 UTC m=+1249.197305570" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.029445 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6mdwl" podStartSLOduration=2.8271956190000003 podStartE2EDuration="49.029427811s" podCreationTimestamp="2026-01-11 17:50:26 +0000 UTC" firstStartedPulling="2026-01-11 17:50:27.919601432 +0000 UTC m=+1202.097794148" lastFinishedPulling="2026-01-11 
17:51:14.121833624 +0000 UTC m=+1248.300026340" observedRunningTime="2026-01-11 17:51:15.027976912 +0000 UTC m=+1249.206169618" watchObservedRunningTime="2026-01-11 17:51:15.029427811 +0000 UTC m=+1249.207620517" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.052085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-public-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.052151 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-httpd-config\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.052169 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-combined-ca-bundle\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.052187 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-797sd\" (UniqueName: \"kubernetes.io/projected/556f75eb-e607-44ee-bbde-cc94844a98bd-kube-api-access-797sd\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.052286 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-internal-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.052369 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-ovndb-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.052519 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-config\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.063426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-httpd-config\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.063431 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-public-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.067276 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-combined-ca-bundle\") pod \"neutron-75f9d98d89-rxjpb\" (UID: 
\"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.067922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-config\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.072775 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-internal-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.077179 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-797sd\" (UniqueName: \"kubernetes.io/projected/556f75eb-e607-44ee-bbde-cc94844a98bd-kube-api-access-797sd\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.077456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/556f75eb-e607-44ee-bbde-cc94844a98bd-ovndb-tls-certs\") pod \"neutron-75f9d98d89-rxjpb\" (UID: \"556f75eb-e607-44ee-bbde-cc94844a98bd\") " pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.229477 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:15 crc kubenswrapper[4837]: I0111 17:51:15.831431 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75f9d98d89-rxjpb"] Jan 11 17:51:15 crc kubenswrapper[4837]: W0111 17:51:15.842901 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod556f75eb_e607_44ee_bbde_cc94844a98bd.slice/crio-5779b07d295b7c12187bab918e283d544a9918adb2f3e113f475b5ed1aeecc26 WatchSource:0}: Error finding container 5779b07d295b7c12187bab918e283d544a9918adb2f3e113f475b5ed1aeecc26: Status 404 returned error can't find the container with id 5779b07d295b7c12187bab918e283d544a9918adb2f3e113f475b5ed1aeecc26 Jan 11 17:51:16 crc kubenswrapper[4837]: I0111 17:51:16.015332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f9d98d89-rxjpb" event={"ID":"556f75eb-e607-44ee-bbde-cc94844a98bd","Type":"ContainerStarted","Data":"5779b07d295b7c12187bab918e283d544a9918adb2f3e113f475b5ed1aeecc26"} Jan 11 17:51:17 crc kubenswrapper[4837]: I0111 17:51:17.025899 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f9d98d89-rxjpb" event={"ID":"556f75eb-e607-44ee-bbde-cc94844a98bd","Type":"ContainerStarted","Data":"26a2c2116b4f3678d4d523323c52c37c94cc2144914a3f3a24f6085131de3355"} Jan 11 17:51:17 crc kubenswrapper[4837]: I0111 17:51:17.026344 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:17 crc kubenswrapper[4837]: I0111 17:51:17.026358 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75f9d98d89-rxjpb" event={"ID":"556f75eb-e607-44ee-bbde-cc94844a98bd","Type":"ContainerStarted","Data":"d65c05b0601a68aea59ab820fe68219ced22c545329d19e9b15fbf320fe106f1"} Jan 11 17:51:17 crc kubenswrapper[4837]: I0111 17:51:17.027401 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-l58f7" event={"ID":"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895","Type":"ContainerStarted","Data":"05e01069189ca9bcb009b7a4e0f5bbac836d0fe3c08bea3479a309d144351b91"} Jan 11 17:51:17 crc kubenswrapper[4837]: I0111 17:51:17.053575 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75f9d98d89-rxjpb" podStartSLOduration=3.053555128 podStartE2EDuration="3.053555128s" podCreationTimestamp="2026-01-11 17:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:17.051602135 +0000 UTC m=+1251.229794871" watchObservedRunningTime="2026-01-11 17:51:17.053555128 +0000 UTC m=+1251.231747844" Jan 11 17:51:17 crc kubenswrapper[4837]: I0111 17:51:17.078588 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-l58f7" podStartSLOduration=2.528902106 podStartE2EDuration="51.078568709s" podCreationTimestamp="2026-01-11 17:50:26 +0000 UTC" firstStartedPulling="2026-01-11 17:50:27.436157846 +0000 UTC m=+1201.614350552" lastFinishedPulling="2026-01-11 17:51:15.985824449 +0000 UTC m=+1250.164017155" observedRunningTime="2026-01-11 17:51:17.069326561 +0000 UTC m=+1251.247519297" watchObservedRunningTime="2026-01-11 17:51:17.078568709 +0000 UTC m=+1251.256761435" Jan 11 17:51:19 crc kubenswrapper[4837]: I0111 17:51:19.045920 4837 generic.go:334] "Generic (PLEG): container finished" podID="5930b460-1c65-4c06-a3bc-f6d6f0518110" containerID="ec89d103ce72e03c31f614723ff515f9f50db651282eb16e23134970bbd6d516" exitCode=0 Jan 11 17:51:19 crc kubenswrapper[4837]: I0111 17:51:19.046032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mdwl" event={"ID":"5930b460-1c65-4c06-a3bc-f6d6f0518110","Type":"ContainerDied","Data":"ec89d103ce72e03c31f614723ff515f9f50db651282eb16e23134970bbd6d516"} Jan 11 17:51:19 crc kubenswrapper[4837]: I0111 17:51:19.832178 
4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d4fd56848-nmkm6" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Jan 11 17:51:19 crc kubenswrapper[4837]: I0111 17:51:19.980962 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7f65cf99f6-zwzzs" podUID="ad90513d-7bd8-4407-af16-8d041440673f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.638085 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.649611 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.718344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-combined-ca-bundle\") pod \"5930b460-1c65-4c06-a3bc-f6d6f0518110\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.718422 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-db-sync-config-data\") pod \"5930b460-1c65-4c06-a3bc-f6d6f0518110\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.718456 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbsr2\" (UniqueName: \"kubernetes.io/projected/5930b460-1c65-4c06-a3bc-f6d6f0518110-kube-api-access-rbsr2\") pod \"5930b460-1c65-4c06-a3bc-f6d6f0518110\" (UID: \"5930b460-1c65-4c06-a3bc-f6d6f0518110\") " Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.725865 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5930b460-1c65-4c06-a3bc-f6d6f0518110" (UID: "5930b460-1c65-4c06-a3bc-f6d6f0518110"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.728355 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5930b460-1c65-4c06-a3bc-f6d6f0518110-kube-api-access-rbsr2" (OuterVolumeSpecName: "kube-api-access-rbsr2") pod "5930b460-1c65-4c06-a3bc-f6d6f0518110" (UID: "5930b460-1c65-4c06-a3bc-f6d6f0518110"). 
InnerVolumeSpecName "kube-api-access-rbsr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.746194 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-vkf4l"] Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.748729 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" podUID="62eadde8-9001-457f-b1e1-86f88c38054f" containerName="dnsmasq-dns" containerID="cri-o://01b38fa03cd288d00d9cae5cf20c33763603aabff635cc6f247d9c7495659eed" gracePeriod=10 Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.766901 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5930b460-1c65-4c06-a3bc-f6d6f0518110" (UID: "5930b460-1c65-4c06-a3bc-f6d6f0518110"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.820729 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.821101 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbsr2\" (UniqueName: \"kubernetes.io/projected/5930b460-1c65-4c06-a3bc-f6d6f0518110-kube-api-access-rbsr2\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:22 crc kubenswrapper[4837]: I0111 17:51:22.821118 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5930b460-1c65-4c06-a3bc-f6d6f0518110-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:23 crc kubenswrapper[4837]: E0111 17:51:23.041599 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.094964 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerStarted","Data":"b9d3d17ee78713de367859cb9ff25e286651fefcf150555d02643be60b2440a9"} Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.095319 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="ceilometer-notification-agent" containerID="cri-o://6365fbb22458d0aa6474e7c381cd0e77f5f0e1ffa3601c5afba83111f5f7eff4" gracePeriod=30 Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.095434 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.095610 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="sg-core" containerID="cri-o://a2be37e1652ad8ecbd0c0c8eb6badcd102f2579adebd7c475651f8d2fad86993" gracePeriod=30 Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.095702 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="proxy-httpd" containerID="cri-o://b9d3d17ee78713de367859cb9ff25e286651fefcf150555d02643be60b2440a9" gracePeriod=30 Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.105959 4837 generic.go:334] "Generic (PLEG): container finished" podID="62eadde8-9001-457f-b1e1-86f88c38054f" containerID="01b38fa03cd288d00d9cae5cf20c33763603aabff635cc6f247d9c7495659eed" exitCode=0 Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.106017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" event={"ID":"62eadde8-9001-457f-b1e1-86f88c38054f","Type":"ContainerDied","Data":"01b38fa03cd288d00d9cae5cf20c33763603aabff635cc6f247d9c7495659eed"} Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.118706 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mdwl" event={"ID":"5930b460-1c65-4c06-a3bc-f6d6f0518110","Type":"ContainerDied","Data":"fac5f64fc3629ce6fec395f0cbcd55a5181cd6499ab0f35614aa3073533395eb"} Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.118748 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fac5f64fc3629ce6fec395f0cbcd55a5181cd6499ab0f35614aa3073533395eb" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.118811 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6mdwl" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.258887 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.258942 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.264658 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.285984 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.297609 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.335264 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsffv\" (UniqueName: \"kubernetes.io/projected/62eadde8-9001-457f-b1e1-86f88c38054f-kube-api-access-zsffv\") pod \"62eadde8-9001-457f-b1e1-86f88c38054f\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.344072 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62eadde8-9001-457f-b1e1-86f88c38054f-kube-api-access-zsffv" (OuterVolumeSpecName: "kube-api-access-zsffv") pod "62eadde8-9001-457f-b1e1-86f88c38054f" (UID: "62eadde8-9001-457f-b1e1-86f88c38054f"). InnerVolumeSpecName "kube-api-access-zsffv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.344957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-sb\") pod \"62eadde8-9001-457f-b1e1-86f88c38054f\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.345093 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-nb\") pod \"62eadde8-9001-457f-b1e1-86f88c38054f\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.345261 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-svc\") pod \"62eadde8-9001-457f-b1e1-86f88c38054f\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.345476 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-config\") pod \"62eadde8-9001-457f-b1e1-86f88c38054f\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.345792 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-swift-storage-0\") pod \"62eadde8-9001-457f-b1e1-86f88c38054f\" (UID: \"62eadde8-9001-457f-b1e1-86f88c38054f\") " Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.347985 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsffv\" (UniqueName: 
\"kubernetes.io/projected/62eadde8-9001-457f-b1e1-86f88c38054f-kube-api-access-zsffv\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.396278 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62eadde8-9001-457f-b1e1-86f88c38054f" (UID: "62eadde8-9001-457f-b1e1-86f88c38054f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.403910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "62eadde8-9001-457f-b1e1-86f88c38054f" (UID: "62eadde8-9001-457f-b1e1-86f88c38054f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.412510 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62eadde8-9001-457f-b1e1-86f88c38054f" (UID: "62eadde8-9001-457f-b1e1-86f88c38054f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.418410 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62eadde8-9001-457f-b1e1-86f88c38054f" (UID: "62eadde8-9001-457f-b1e1-86f88c38054f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.425286 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-config" (OuterVolumeSpecName: "config") pod "62eadde8-9001-457f-b1e1-86f88c38054f" (UID: "62eadde8-9001-457f-b1e1-86f88c38054f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.450262 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.450540 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.450650 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.450760 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.450848 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62eadde8-9001-457f-b1e1-86f88c38054f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.976636 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f749597dc-j8n24"] Jan 11 17:51:23 crc kubenswrapper[4837]: 
E0111 17:51:23.979497 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eadde8-9001-457f-b1e1-86f88c38054f" containerName="init" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.979524 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eadde8-9001-457f-b1e1-86f88c38054f" containerName="init" Jan 11 17:51:23 crc kubenswrapper[4837]: E0111 17:51:23.979555 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eadde8-9001-457f-b1e1-86f88c38054f" containerName="dnsmasq-dns" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.979562 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eadde8-9001-457f-b1e1-86f88c38054f" containerName="dnsmasq-dns" Jan 11 17:51:23 crc kubenswrapper[4837]: E0111 17:51:23.979577 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5930b460-1c65-4c06-a3bc-f6d6f0518110" containerName="barbican-db-sync" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.979585 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5930b460-1c65-4c06-a3bc-f6d6f0518110" containerName="barbican-db-sync" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.979871 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5930b460-1c65-4c06-a3bc-f6d6f0518110" containerName="barbican-db-sync" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.979898 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="62eadde8-9001-457f-b1e1-86f88c38054f" containerName="dnsmasq-dns" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.981021 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.983734 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fvnjj" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.983792 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.983734 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 11 17:51:23 crc kubenswrapper[4837]: I0111 17:51:23.988066 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f749597dc-j8n24"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.047963 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6dbd56445d-4bk5s"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.049327 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.053985 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.060521 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-config-data\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.060589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m2vn\" (UniqueName: \"kubernetes.io/projected/59aacef4-5c25-42e6-a96f-5ca46dc94667-kube-api-access-2m2vn\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.060604 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6dbd56445d-4bk5s"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.060627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-config-data-custom\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.060935 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59aacef4-5c25-42e6-a96f-5ca46dc94667-logs\") pod 
\"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.060978 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-combined-ca-bundle\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.071784 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-kg5rw"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.074433 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.104592 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-kg5rw"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.135825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" event={"ID":"62eadde8-9001-457f-b1e1-86f88c38054f","Type":"ContainerDied","Data":"e7fcc437f2309e28763b248b036ba897d7572fbf437a3bb140e5ba586d9c8155"} Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.135998 4837 scope.go:117] "RemoveContainer" containerID="01b38fa03cd288d00d9cae5cf20c33763603aabff635cc6f247d9c7495659eed" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.136226 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-vkf4l" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.144967 4837 generic.go:334] "Generic (PLEG): container finished" podID="5ec0beaf-de63-407f-8d18-46738023ab11" containerID="a2be37e1652ad8ecbd0c0c8eb6badcd102f2579adebd7c475651f8d2fad86993" exitCode=2 Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.145151 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerDied","Data":"a2be37e1652ad8ecbd0c0c8eb6badcd102f2579adebd7c475651f8d2fad86993"} Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.145908 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.145933 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163608 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163662 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-config-data\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163715 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-config\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163733 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-config-data\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163765 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzqkv\" (UniqueName: \"kubernetes.io/projected/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-kube-api-access-qzqkv\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163786 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6hb2\" (UniqueName: \"kubernetes.io/projected/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-kube-api-access-n6hb2\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163812 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m2vn\" (UniqueName: \"kubernetes.io/projected/59aacef4-5c25-42e6-a96f-5ca46dc94667-kube-api-access-2m2vn\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163846 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-config-data-custom\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163869 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163897 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-config-data-custom\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163912 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163925 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-logs\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " 
pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-combined-ca-bundle\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.163986 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59aacef4-5c25-42e6-a96f-5ca46dc94667-logs\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.164004 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-combined-ca-bundle\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.164025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.169189 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59aacef4-5c25-42e6-a96f-5ca46dc94667-logs\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " 
pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.170381 4837 scope.go:117] "RemoveContainer" containerID="a4309113013c24ff187b372a2a4d740fc274a55d1c33b095f5dd5d925201ff1e" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.172830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-config-data\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.184222 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-config-data-custom\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.187219 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59aacef4-5c25-42e6-a96f-5ca46dc94667-combined-ca-bundle\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.199615 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m2vn\" (UniqueName: \"kubernetes.io/projected/59aacef4-5c25-42e6-a96f-5ca46dc94667-kube-api-access-2m2vn\") pod \"barbican-worker-5f749597dc-j8n24\" (UID: \"59aacef4-5c25-42e6-a96f-5ca46dc94667\") " pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.205419 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-vkf4l"] Jan 11 17:51:24 crc 
kubenswrapper[4837]: I0111 17:51:24.215923 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-vkf4l"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268638 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268718 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-config-data-custom\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268739 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-logs\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268788 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-combined-ca-bundle\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268863 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-config\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268909 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-config-data\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzqkv\" (UniqueName: 
\"kubernetes.io/projected/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-kube-api-access-qzqkv\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.268952 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6hb2\" (UniqueName: \"kubernetes.io/projected/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-kube-api-access-n6hb2\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.269869 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.272078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-logs\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.275698 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.282338 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-combined-ca-bundle\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.289585 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-config-data-custom\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.293336 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-config-data\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.298362 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.301310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.301896 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-config\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.303452 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76c97b5b58-7nlx9"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.312374 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.314089 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f749597dc-j8n24" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.316764 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.327576 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76c97b5b58-7nlx9"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.328429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6hb2\" (UniqueName: \"kubernetes.io/projected/bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0-kube-api-access-n6hb2\") pod \"barbican-keystone-listener-6dbd56445d-4bk5s\" (UID: \"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0\") " pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.338897 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzqkv\" (UniqueName: \"kubernetes.io/projected/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-kube-api-access-qzqkv\") pod \"dnsmasq-dns-75c8ddd69c-kg5rw\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.369595 4837 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.370004 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrwb\" (UniqueName: \"kubernetes.io/projected/ad710eac-6bbf-4930-9696-bd5bcd59be03-kube-api-access-bjrwb\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.370050 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.370091 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data-custom\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.370218 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-combined-ca-bundle\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.370253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ad710eac-6bbf-4930-9696-bd5bcd59be03-logs\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.378485 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62eadde8-9001-457f-b1e1-86f88c38054f" path="/var/lib/kubelet/pods/62eadde8-9001-457f-b1e1-86f88c38054f/volumes" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.413586 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.471097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data-custom\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.471173 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-combined-ca-bundle\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.471205 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad710eac-6bbf-4930-9696-bd5bcd59be03-logs\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.471293 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjrwb\" (UniqueName: 
\"kubernetes.io/projected/ad710eac-6bbf-4930-9696-bd5bcd59be03-kube-api-access-bjrwb\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.471322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.473303 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad710eac-6bbf-4930-9696-bd5bcd59be03-logs\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.479064 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.479172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-combined-ca-bundle\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.482249 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data-custom\") pod 
\"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.516216 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrwb\" (UniqueName: \"kubernetes.io/projected/ad710eac-6bbf-4930-9696-bd5bcd59be03-kube-api-access-bjrwb\") pod \"barbican-api-76c97b5b58-7nlx9\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.669009 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.772607 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f749597dc-j8n24"] Jan 11 17:51:24 crc kubenswrapper[4837]: W0111 17:51:24.794118 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59aacef4_5c25_42e6_a96f_5ca46dc94667.slice/crio-7f04fa3fb67d118dd4625b42afcf170c44d1f39f22cf1e5578fc6fae9733e40e WatchSource:0}: Error finding container 7f04fa3fb67d118dd4625b42afcf170c44d1f39f22cf1e5578fc6fae9733e40e: Status 404 returned error can't find the container with id 7f04fa3fb67d118dd4625b42afcf170c44d1f39f22cf1e5578fc6fae9733e40e Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.891359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6dbd56445d-4bk5s"] Jan 11 17:51:24 crc kubenswrapper[4837]: I0111 17:51:24.955949 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-kg5rw"] Jan 11 17:51:25 crc kubenswrapper[4837]: I0111 17:51:25.155353 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" 
event={"ID":"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0","Type":"ContainerStarted","Data":"b55d476319078ee0aa2fa26ef81197347c0ea7f03038ca4ce17556a85130a501"} Jan 11 17:51:25 crc kubenswrapper[4837]: I0111 17:51:25.161873 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f749597dc-j8n24" event={"ID":"59aacef4-5c25-42e6-a96f-5ca46dc94667","Type":"ContainerStarted","Data":"7f04fa3fb67d118dd4625b42afcf170c44d1f39f22cf1e5578fc6fae9733e40e"} Jan 11 17:51:25 crc kubenswrapper[4837]: I0111 17:51:25.166221 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" event={"ID":"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15","Type":"ContainerStarted","Data":"d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f"} Jan 11 17:51:25 crc kubenswrapper[4837]: I0111 17:51:25.166275 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" event={"ID":"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15","Type":"ContainerStarted","Data":"089c58ffc44ca2978a114f706c6c8574734b11abb2cfcb9c3379d93259d7d96b"} Jan 11 17:51:25 crc kubenswrapper[4837]: I0111 17:51:25.168570 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76c97b5b58-7nlx9"] Jan 11 17:51:25 crc kubenswrapper[4837]: W0111 17:51:25.198943 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad710eac_6bbf_4930_9696_bd5bcd59be03.slice/crio-860e0b318088350ff726da9197bd034ff68fd92be56c5aa31fca2b3c475d1dc0 WatchSource:0}: Error finding container 860e0b318088350ff726da9197bd034ff68fd92be56c5aa31fca2b3c475d1dc0: Status 404 returned error can't find the container with id 860e0b318088350ff726da9197bd034ff68fd92be56c5aa31fca2b3c475d1dc0 Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.178933 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c97b5b58-7nlx9" 
event={"ID":"ad710eac-6bbf-4930-9696-bd5bcd59be03","Type":"ContainerStarted","Data":"b766ce5b582e0bdaabd51aaef7f27edaa2ca5b3c905c8a830f6bf72a066f1008"} Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.178988 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c97b5b58-7nlx9" event={"ID":"ad710eac-6bbf-4930-9696-bd5bcd59be03","Type":"ContainerStarted","Data":"860e0b318088350ff726da9197bd034ff68fd92be56c5aa31fca2b3c475d1dc0"} Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.180744 4837 generic.go:334] "Generic (PLEG): container finished" podID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerID="d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f" exitCode=0 Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.180791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" event={"ID":"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15","Type":"ContainerDied","Data":"d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f"} Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.180864 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.180877 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.289550 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.451791 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.952924 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c9967c84b-cfjvt"] Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.956425 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.961630 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c9967c84b-cfjvt"] Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.987486 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 11 17:51:26 crc kubenswrapper[4837]: I0111 17:51:26.987619 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.142772 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwn99\" (UniqueName: \"kubernetes.io/projected/42d725d3-10ec-4492-8598-b505cef336fd-kube-api-access-dwn99\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.143017 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d725d3-10ec-4492-8598-b505cef336fd-logs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.143099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-config-data-custom\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.143117 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-config-data\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.143244 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-combined-ca-bundle\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.143325 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-public-tls-certs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.143516 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-internal-tls-certs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.189935 4837 generic.go:334] "Generic (PLEG): container finished" podID="ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" containerID="05e01069189ca9bcb009b7a4e0f5bbac836d0fe3c08bea3479a309d144351b91" exitCode=0 Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.189999 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-l58f7" 
event={"ID":"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895","Type":"ContainerDied","Data":"05e01069189ca9bcb009b7a4e0f5bbac836d0fe3c08bea3479a309d144351b91"} Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.192708 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c97b5b58-7nlx9" event={"ID":"ad710eac-6bbf-4930-9696-bd5bcd59be03","Type":"ContainerStarted","Data":"475d5dd452fc80975f970d393333df82e4feac014fe06d78e0b3f1820bf31f19"} Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.192911 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.192932 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.195716 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" event={"ID":"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15","Type":"ContainerStarted","Data":"5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5"} Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.195755 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.242764 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" podStartSLOduration=3.242744999 podStartE2EDuration="3.242744999s" podCreationTimestamp="2026-01-11 17:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:27.223404869 +0000 UTC m=+1261.401597575" watchObservedRunningTime="2026-01-11 17:51:27.242744999 +0000 UTC m=+1261.420937705" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.245001 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dwn99\" (UniqueName: \"kubernetes.io/projected/42d725d3-10ec-4492-8598-b505cef336fd-kube-api-access-dwn99\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.245046 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d725d3-10ec-4492-8598-b505cef336fd-logs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.245114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-config-data-custom\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.245131 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-config-data\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.245157 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-combined-ca-bundle\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.245180 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-public-tls-certs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.245214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-internal-tls-certs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.249910 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d725d3-10ec-4492-8598-b505cef336fd-logs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.250247 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-internal-tls-certs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.252611 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76c97b5b58-7nlx9" podStartSLOduration=3.252590263 podStartE2EDuration="3.252590263s" podCreationTimestamp="2026-01-11 17:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:27.240995691 +0000 UTC m=+1261.419188397" watchObservedRunningTime="2026-01-11 17:51:27.252590263 +0000 UTC m=+1261.430782969" Jan 11 17:51:27 crc kubenswrapper[4837]: 
I0111 17:51:27.253350 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-combined-ca-bundle\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.254202 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-public-tls-certs\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.255344 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-config-data-custom\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.255840 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d725d3-10ec-4492-8598-b505cef336fd-config-data\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.263354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwn99\" (UniqueName: \"kubernetes.io/projected/42d725d3-10ec-4492-8598-b505cef336fd-kube-api-access-dwn99\") pod \"barbican-api-c9967c84b-cfjvt\" (UID: \"42d725d3-10ec-4492-8598-b505cef336fd\") " pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.326877 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:27 crc kubenswrapper[4837]: I0111 17:51:27.808648 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c9967c84b-cfjvt"] Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.212231 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c9967c84b-cfjvt" event={"ID":"42d725d3-10ec-4492-8598-b505cef336fd","Type":"ContainerStarted","Data":"5413626c1e7e19fff260c854a796f233fef59915ded11c652acefe7dc889a2f3"} Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.616108 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-l58f7" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.780932 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-combined-ca-bundle\") pod \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.781050 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-db-sync-config-data\") pod \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.781134 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-etc-machine-id\") pod \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.781254 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" (UID: "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.781362 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-scripts\") pod \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.781456 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7st6m\" (UniqueName: \"kubernetes.io/projected/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-kube-api-access-7st6m\") pod \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.781580 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-config-data\") pod \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\" (UID: \"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895\") " Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.782897 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.787508 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-scripts" (OuterVolumeSpecName: "scripts") pod "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" (UID: "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.788325 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" (UID: "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.792141 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-kube-api-access-7st6m" (OuterVolumeSpecName: "kube-api-access-7st6m") pod "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" (UID: "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895"). InnerVolumeSpecName "kube-api-access-7st6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.811409 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" (UID: "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.842782 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-config-data" (OuterVolumeSpecName: "config-data") pod "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" (UID: "ecc4b7a3-4585-48ee-9cf3-caa58c9f5895"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.885530 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.885639 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7st6m\" (UniqueName: \"kubernetes.io/projected/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-kube-api-access-7st6m\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.885793 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.885826 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:28 crc kubenswrapper[4837]: I0111 17:51:28.885845 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.223981 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-l58f7" event={"ID":"ecc4b7a3-4585-48ee-9cf3-caa58c9f5895","Type":"ContainerDied","Data":"15e5cfa4043cd70b8c50268be91873413c661ee651c5e28d59fff158e2f43142"} Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.224072 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e5cfa4043cd70b8c50268be91873413c661ee651c5e28d59fff158e2f43142" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.224065 4837 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-l58f7" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.233900 4837 generic.go:334] "Generic (PLEG): container finished" podID="5ec0beaf-de63-407f-8d18-46738023ab11" containerID="6365fbb22458d0aa6474e7c381cd0e77f5f0e1ffa3601c5afba83111f5f7eff4" exitCode=0 Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.233962 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerDied","Data":"6365fbb22458d0aa6474e7c381cd0e77f5f0e1ffa3601c5afba83111f5f7eff4"} Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.235171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c9967c84b-cfjvt" event={"ID":"42d725d3-10ec-4492-8598-b505cef336fd","Type":"ContainerStarted","Data":"805eae1d05ff526d32a010732e2a0249ebab8e7fa93fda58d7fba7ca8ae51cf1"} Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.469591 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:29 crc kubenswrapper[4837]: E0111 17:51:29.470197 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" containerName="cinder-db-sync" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.470213 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" containerName="cinder-db-sync" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.470383 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" containerName="cinder-db-sync" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.473406 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.478228 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6cjjv" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.478400 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.478435 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.479209 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.489479 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.541871 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-kg5rw"] Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.542172 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" podUID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerName="dnsmasq-dns" containerID="cri-o://5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5" gracePeriod=10 Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.582015 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-g67l9"] Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.583473 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.600722 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2pg5\" (UniqueName: \"kubernetes.io/projected/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-kube-api-access-s2pg5\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.600779 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.600805 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.600879 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.600899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " 
pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.600926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.619380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-g67l9"] Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702808 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702863 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2pg5\" (UniqueName: \"kubernetes.io/projected/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-kube-api-access-s2pg5\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702883 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-config\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702907 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702925 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702951 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702968 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.702994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-svc\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.703056 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.703074 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.703102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.703124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8cpm\" (UniqueName: \"kubernetes.io/projected/93f666ee-5f8d-4403-83f6-87d4be8c0961-kube-api-access-l8cpm\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.704488 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.709342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.710891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.711325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.730206 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2pg5\" (UniqueName: \"kubernetes.io/projected/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-kube-api-access-s2pg5\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.743544 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.792723 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.804881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.804937 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-config\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.804998 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.805027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.805051 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-svc\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 
17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.805144 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8cpm\" (UniqueName: \"kubernetes.io/projected/93f666ee-5f8d-4403-83f6-87d4be8c0961-kube-api-access-l8cpm\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.805870 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.806035 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.806245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-config\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.807928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.808090 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-svc\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.821919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8cpm\" (UniqueName: \"kubernetes.io/projected/93f666ee-5f8d-4403-83f6-87d4be8c0961-kube-api-access-l8cpm\") pod \"dnsmasq-dns-5784cf869f-g67l9\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.879150 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.880572 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.882561 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.889858 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:29 crc kubenswrapper[4837]: I0111 17:51:29.906271 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.010265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data-custom\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.010327 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-logs\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.010350 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.010491 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.010646 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-scripts\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.010786 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.010906 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9g7p\" (UniqueName: \"kubernetes.io/projected/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-kube-api-access-c9g7p\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112476 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-logs\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112530 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112576 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112611 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-scripts\") pod 
\"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112642 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9g7p\" (UniqueName: \"kubernetes.io/projected/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-kube-api-access-c9g7p\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112763 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data-custom\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.112926 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.113035 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-logs\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.116813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-scripts\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.116877 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.118427 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.127130 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data-custom\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.129095 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9g7p\" (UniqueName: \"kubernetes.io/projected/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-kube-api-access-c9g7p\") pod \"cinder-api-0\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " pod="openstack/cinder-api-0" Jan 11 17:51:30 crc kubenswrapper[4837]: I0111 17:51:30.208233 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.042478 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.134007 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-nb\") pod \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.134372 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-swift-storage-0\") pod \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.134504 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-config\") pod \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.134529 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzqkv\" (UniqueName: \"kubernetes.io/projected/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-kube-api-access-qzqkv\") pod \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.134566 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-svc\") pod \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.134617 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-sb\") pod \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\" (UID: \"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15\") " Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.139514 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-kube-api-access-qzqkv" (OuterVolumeSpecName: "kube-api-access-qzqkv") pod "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" (UID: "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15"). InnerVolumeSpecName "kube-api-access-qzqkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.223505 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" (UID: "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.233465 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" (UID: "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.235035 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-config" (OuterVolumeSpecName: "config") pod "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" (UID: "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.238249 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.238277 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzqkv\" (UniqueName: \"kubernetes.io/projected/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-kube-api-access-qzqkv\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.238287 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.238296 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.250726 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" (UID: "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.267258 4837 generic.go:334] "Generic (PLEG): container finished" podID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerID="5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5" exitCode=0 Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.267303 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" event={"ID":"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15","Type":"ContainerDied","Data":"5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5"} Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.267330 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" event={"ID":"fee2b6c4-27f7-4652-ac09-fa6c99ef0a15","Type":"ContainerDied","Data":"089c58ffc44ca2978a114f706c6c8574734b11abb2cfcb9c3379d93259d7d96b"} Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.267347 4837 scope.go:117] "RemoveContainer" containerID="5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.267355 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-kg5rw" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.294156 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" (UID: "fee2b6c4-27f7-4652-ac09-fa6c99ef0a15"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.343920 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.343947 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.501026 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.604019 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-kg5rw"] Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.619536 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-kg5rw"] Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.733837 4837 scope.go:117] "RemoveContainer" containerID="d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.734014 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.746120 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.843386 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-g67l9"] Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.859481 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.888409 4837 scope.go:117] 
"RemoveContainer" containerID="5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5" Jan 11 17:51:31 crc kubenswrapper[4837]: E0111 17:51:31.891867 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5\": container with ID starting with 5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5 not found: ID does not exist" containerID="5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.891920 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5"} err="failed to get container status \"5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5\": rpc error: code = NotFound desc = could not find container \"5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5\": container with ID starting with 5eee4df63284b573630d26a7df8f50546aeb5216fd6ff0bec48819b84c6902c5 not found: ID does not exist" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.891950 4837 scope.go:117] "RemoveContainer" containerID="d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f" Jan 11 17:51:31 crc kubenswrapper[4837]: E0111 17:51:31.892592 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f\": container with ID starting with d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f not found: ID does not exist" containerID="d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f" Jan 11 17:51:31 crc kubenswrapper[4837]: I0111 17:51:31.892652 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f"} err="failed to get container status \"d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f\": rpc error: code = NotFound desc = could not find container \"d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f\": container with ID starting with d8d3ab45be2e8915c97fae87bd745b998815600b1a8524f911e172c191ded09f not found: ID does not exist" Jan 11 17:51:31 crc kubenswrapper[4837]: W0111 17:51:31.904143 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93f666ee_5f8d_4403_83f6_87d4be8c0961.slice/crio-94c675d79319e10e0baaa4d982b25b6f72883f8b715f95734542d0c51cf4dea4 WatchSource:0}: Error finding container 94c675d79319e10e0baaa4d982b25b6f72883f8b715f95734542d0c51cf4dea4: Status 404 returned error can't find the container with id 94c675d79319e10e0baaa4d982b25b6f72883f8b715f95734542d0c51cf4dea4 Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.212105 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.342942 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" event={"ID":"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0","Type":"ContainerStarted","Data":"22006f0440d36aca1384aa062837be428020bd01dc57a669d81dc23e21551afc"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.354978 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd8316e1-8d7c-4c39-82d3-d200b7d6b164","Type":"ContainerStarted","Data":"7d7e6e770d9baa5854a7fc215c60b7e52bdbbfb361c6fe499e037f0b38dc82f9"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.371023 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"f941f2f8-0d7f-409f-929b-7bbba98e0ca5","Type":"ContainerStarted","Data":"4c9aaf609124d7d6740a715e656e6ace81c1f250b3f9433e66fd6d6b2eabdb4c"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.401222 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" path="/var/lib/kubelet/pods/fee2b6c4-27f7-4652-ac09-fa6c99ef0a15/volumes" Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.401972 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f749597dc-j8n24" event={"ID":"59aacef4-5c25-42e6-a96f-5ca46dc94667","Type":"ContainerStarted","Data":"b520ed7d12396ad6b8923292ef0fdf1f4146ac5577ebbd783203852c7323b9ed"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.401999 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f749597dc-j8n24" event={"ID":"59aacef4-5c25-42e6-a96f-5ca46dc94667","Type":"ContainerStarted","Data":"6c73c7d2a39aa1c59221b330927a9634ae46d443b7870ce7b4926e664f09791f"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.420485 4837 generic.go:334] "Generic (PLEG): container finished" podID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerID="fa47df7f17c26e02c12ae6c3c6f7e57a5ec73cd8f7bc2c53cacdf4e36a1ae592" exitCode=0 Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.420595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" event={"ID":"93f666ee-5f8d-4403-83f6-87d4be8c0961","Type":"ContainerDied","Data":"fa47df7f17c26e02c12ae6c3c6f7e57a5ec73cd8f7bc2c53cacdf4e36a1ae592"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.420627 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" event={"ID":"93f666ee-5f8d-4403-83f6-87d4be8c0961","Type":"ContainerStarted","Data":"94c675d79319e10e0baaa4d982b25b6f72883f8b715f95734542d0c51cf4dea4"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.421079 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f749597dc-j8n24" podStartSLOduration=3.125011116 podStartE2EDuration="9.421061692s" podCreationTimestamp="2026-01-11 17:51:23 +0000 UTC" firstStartedPulling="2026-01-11 17:51:24.800923954 +0000 UTC m=+1258.979116660" lastFinishedPulling="2026-01-11 17:51:31.09697454 +0000 UTC m=+1265.275167236" observedRunningTime="2026-01-11 17:51:32.415735529 +0000 UTC m=+1266.593928235" watchObservedRunningTime="2026-01-11 17:51:32.421061692 +0000 UTC m=+1266.599254398" Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.444981 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c9967c84b-cfjvt" event={"ID":"42d725d3-10ec-4492-8598-b505cef336fd","Type":"ContainerStarted","Data":"21508e256d15a39f6222eb2ed1cd3b7c5a19ad681db48d981caac57e456252aa"} Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.445154 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.445196 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:32 crc kubenswrapper[4837]: I0111 17:51:32.504391 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c9967c84b-cfjvt" podStartSLOduration=6.50437114 podStartE2EDuration="6.50437114s" podCreationTimestamp="2026-01-11 17:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:32.481882356 +0000 UTC m=+1266.660075062" watchObservedRunningTime="2026-01-11 17:51:32.50437114 +0000 UTC m=+1266.682563846" Jan 11 17:51:33 crc kubenswrapper[4837]: I0111 17:51:33.460643 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" 
event={"ID":"bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0","Type":"ContainerStarted","Data":"55e054e9eff051bb35752b5787184b836880944131b850367cec0661d5c4a5de"} Jan 11 17:51:33 crc kubenswrapper[4837]: I0111 17:51:33.488594 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6dbd56445d-4bk5s" podStartSLOduration=3.275536879 podStartE2EDuration="9.488578455s" podCreationTimestamp="2026-01-11 17:51:24 +0000 UTC" firstStartedPulling="2026-01-11 17:51:24.901502165 +0000 UTC m=+1259.079694871" lastFinishedPulling="2026-01-11 17:51:31.114543741 +0000 UTC m=+1265.292736447" observedRunningTime="2026-01-11 17:51:33.484438644 +0000 UTC m=+1267.662631350" watchObservedRunningTime="2026-01-11 17:51:33.488578455 +0000 UTC m=+1267.666771161" Jan 11 17:51:33 crc kubenswrapper[4837]: I0111 17:51:33.593213 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7f65cf99f6-zwzzs" Jan 11 17:51:33 crc kubenswrapper[4837]: I0111 17:51:33.704414 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d4fd56848-nmkm6"] Jan 11 17:51:33 crc kubenswrapper[4837]: I0111 17:51:33.704649 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d4fd56848-nmkm6" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon-log" containerID="cri-o://6d7775187bc9d8933cd74c6fb7ea42d457f38072bf52d7ec4be27690a1b39be4" gracePeriod=30 Jan 11 17:51:33 crc kubenswrapper[4837]: I0111 17:51:33.705118 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d4fd56848-nmkm6" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" containerID="cri-o://31448bf7a925ddeb3fb3d1d51e19aa71d7fe4393dd1c0b9b7fa3db44c4fe276a" gracePeriod=30 Jan 11 17:51:33 crc kubenswrapper[4837]: I0111 17:51:33.715978 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d4fd56848-nmkm6" 
podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Jan 11 17:51:34 crc kubenswrapper[4837]: I0111 17:51:34.494732 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" event={"ID":"93f666ee-5f8d-4403-83f6-87d4be8c0961","Type":"ContainerStarted","Data":"ed239e40b4823cea4d674b64febf13e8f69e00a4959abf08b0694156c212182a"} Jan 11 17:51:34 crc kubenswrapper[4837]: I0111 17:51:34.496785 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:34 crc kubenswrapper[4837]: I0111 17:51:34.506241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f941f2f8-0d7f-409f-929b-7bbba98e0ca5","Type":"ContainerStarted","Data":"4998098392e360ba2c365746769e724ce16240b83808bf1f778cb965de8950a6"} Jan 11 17:51:34 crc kubenswrapper[4837]: I0111 17:51:34.520829 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" podStartSLOduration=5.5208148999999995 podStartE2EDuration="5.5208149s" podCreationTimestamp="2026-01-11 17:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:34.517541072 +0000 UTC m=+1268.695733788" watchObservedRunningTime="2026-01-11 17:51:34.5208149 +0000 UTC m=+1268.699007606" Jan 11 17:51:35 crc kubenswrapper[4837]: I0111 17:51:35.306436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:35 crc kubenswrapper[4837]: I0111 17:51:35.533722 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"dd8316e1-8d7c-4c39-82d3-d200b7d6b164","Type":"ContainerStarted","Data":"b6a679bed9d738026e66ca403cabbc6df8c6a4ec474b2e5487c4b9b0b0221cf5"} Jan 11 17:51:35 crc kubenswrapper[4837]: I0111 17:51:35.540895 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f941f2f8-0d7f-409f-929b-7bbba98e0ca5","Type":"ContainerStarted","Data":"ac258adcb74c01f9cb073cb9a261b4990f17d07735f996690a0eaaf343f8bf3c"} Jan 11 17:51:35 crc kubenswrapper[4837]: I0111 17:51:35.541059 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api-log" containerID="cri-o://4998098392e360ba2c365746769e724ce16240b83808bf1f778cb965de8950a6" gracePeriod=30 Jan 11 17:51:35 crc kubenswrapper[4837]: I0111 17:51:35.541615 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api" containerID="cri-o://ac258adcb74c01f9cb073cb9a261b4990f17d07735f996690a0eaaf343f8bf3c" gracePeriod=30 Jan 11 17:51:35 crc kubenswrapper[4837]: E0111 17:51:35.726492 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf941f2f8_0d7f_409f_929b_7bbba98e0ca5.slice/crio-4998098392e360ba2c365746769e724ce16240b83808bf1f778cb965de8950a6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf941f2f8_0d7f_409f_929b_7bbba98e0ca5.slice/crio-conmon-4998098392e360ba2c365746769e724ce16240b83808bf1f778cb965de8950a6.scope\": RecentStats: unable to find data in memory cache]" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.121322 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 
17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.150712 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.150688577 podStartE2EDuration="7.150688577s" podCreationTimestamp="2026-01-11 17:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:35.565627093 +0000 UTC m=+1269.743819839" watchObservedRunningTime="2026-01-11 17:51:36.150688577 +0000 UTC m=+1270.328881293" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.241209 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.551468 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd8316e1-8d7c-4c39-82d3-d200b7d6b164","Type":"ContainerStarted","Data":"4dbb2d3932eec10095c838000b83e22f0b2f31af8208deff6a1a6a290468f6e5"} Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.555698 4837 generic.go:334] "Generic (PLEG): container finished" podID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerID="ac258adcb74c01f9cb073cb9a261b4990f17d07735f996690a0eaaf343f8bf3c" exitCode=0 Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.555729 4837 generic.go:334] "Generic (PLEG): container finished" podID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerID="4998098392e360ba2c365746769e724ce16240b83808bf1f778cb965de8950a6" exitCode=143 Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.555712 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f941f2f8-0d7f-409f-929b-7bbba98e0ca5","Type":"ContainerDied","Data":"ac258adcb74c01f9cb073cb9a261b4990f17d07735f996690a0eaaf343f8bf3c"} Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.555779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"f941f2f8-0d7f-409f-929b-7bbba98e0ca5","Type":"ContainerDied","Data":"4998098392e360ba2c365746769e724ce16240b83808bf1f778cb965de8950a6"} Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.573045 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.9641381970000005 podStartE2EDuration="7.57302791s" podCreationTimestamp="2026-01-11 17:51:29 +0000 UTC" firstStartedPulling="2026-01-11 17:51:31.771765233 +0000 UTC m=+1265.949957939" lastFinishedPulling="2026-01-11 17:51:34.380654956 +0000 UTC m=+1268.558847652" observedRunningTime="2026-01-11 17:51:36.570214844 +0000 UTC m=+1270.748407550" watchObservedRunningTime="2026-01-11 17:51:36.57302791 +0000 UTC m=+1270.751220616" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.914357 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.919712 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-scripts\") pod \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.919829 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9g7p\" (UniqueName: \"kubernetes.io/projected/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-kube-api-access-c9g7p\") pod \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.920017 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data-custom\") pod \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\" (UID: 
\"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.920090 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data\") pod \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.920194 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-logs\") pod \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.920556 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-etc-machine-id\") pod \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.921468 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-combined-ca-bundle\") pod \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\" (UID: \"f941f2f8-0d7f-409f-929b-7bbba98e0ca5\") " Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.922982 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-logs" (OuterVolumeSpecName: "logs") pod "f941f2f8-0d7f-409f-929b-7bbba98e0ca5" (UID: "f941f2f8-0d7f-409f-929b-7bbba98e0ca5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.923043 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f941f2f8-0d7f-409f-929b-7bbba98e0ca5" (UID: "f941f2f8-0d7f-409f-929b-7bbba98e0ca5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.927062 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-scripts" (OuterVolumeSpecName: "scripts") pod "f941f2f8-0d7f-409f-929b-7bbba98e0ca5" (UID: "f941f2f8-0d7f-409f-929b-7bbba98e0ca5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.927205 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-kube-api-access-c9g7p" (OuterVolumeSpecName: "kube-api-access-c9g7p") pod "f941f2f8-0d7f-409f-929b-7bbba98e0ca5" (UID: "f941f2f8-0d7f-409f-929b-7bbba98e0ca5"). InnerVolumeSpecName "kube-api-access-c9g7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:36 crc kubenswrapper[4837]: I0111 17:51:36.940320 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f941f2f8-0d7f-409f-929b-7bbba98e0ca5" (UID: "f941f2f8-0d7f-409f-929b-7bbba98e0ca5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.023279 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f941f2f8-0d7f-409f-929b-7bbba98e0ca5" (UID: "f941f2f8-0d7f-409f-929b-7bbba98e0ca5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.024877 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.024892 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9g7p\" (UniqueName: \"kubernetes.io/projected/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-kube-api-access-c9g7p\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.024905 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.024915 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.024925 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.024933 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.043016 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data" (OuterVolumeSpecName: "config-data") pod "f941f2f8-0d7f-409f-929b-7bbba98e0ca5" (UID: "f941f2f8-0d7f-409f-929b-7bbba98e0ca5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.114266 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d4fd56848-nmkm6" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:53900->10.217.0.151:8443: read: connection reset by peer" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.126606 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f941f2f8-0d7f-409f-929b-7bbba98e0ca5-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.564981 4837 generic.go:334] "Generic (PLEG): container finished" podID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerID="31448bf7a925ddeb3fb3d1d51e19aa71d7fe4393dd1c0b9b7fa3db44c4fe276a" exitCode=0 Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.565086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d4fd56848-nmkm6" event={"ID":"af5aeb3b-e789-4f43-ac70-bb570e59027e","Type":"ContainerDied","Data":"31448bf7a925ddeb3fb3d1d51e19aa71d7fe4393dd1c0b9b7fa3db44c4fe276a"} Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.568518 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.568559 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f941f2f8-0d7f-409f-929b-7bbba98e0ca5","Type":"ContainerDied","Data":"4c9aaf609124d7d6740a715e656e6ace81c1f250b3f9433e66fd6d6b2eabdb4c"} Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.568620 4837 scope.go:117] "RemoveContainer" containerID="ac258adcb74c01f9cb073cb9a261b4990f17d07735f996690a0eaaf343f8bf3c" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.592160 4837 scope.go:117] "RemoveContainer" containerID="4998098392e360ba2c365746769e724ce16240b83808bf1f778cb965de8950a6" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.600860 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.610020 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627178 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:37 crc kubenswrapper[4837]: E0111 17:51:37.627578 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerName="init" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627595 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerName="init" Jan 11 17:51:37 crc kubenswrapper[4837]: E0111 17:51:37.627618 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api-log" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627625 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api-log" Jan 11 17:51:37 crc kubenswrapper[4837]: E0111 17:51:37.627634 4837 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerName="dnsmasq-dns" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627640 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerName="dnsmasq-dns" Jan 11 17:51:37 crc kubenswrapper[4837]: E0111 17:51:37.627649 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627655 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627845 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627871 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee2b6c4-27f7-4652-ac09-fa6c99ef0a15" containerName="dnsmasq-dns" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.627880 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" containerName="cinder-api-log" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.628913 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.632849 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.633026 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.633144 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbcjq\" (UniqueName: \"kubernetes.io/projected/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-kube-api-access-fbcjq\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636319 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636338 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636419 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-scripts\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-config-data-custom\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.636628 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-logs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.637819 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-config-data\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 
17:51:37.652380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.739887 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbcjq\" (UniqueName: \"kubernetes.io/projected/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-kube-api-access-fbcjq\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740152 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740221 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740318 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740400 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740479 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-scripts\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-config-data-custom\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740320 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-logs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.740879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-config-data\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.741133 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-logs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" 
Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.744739 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.745317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.747874 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-config-data\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.748385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.751168 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-config-data-custom\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.753869 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-scripts\") pod 
\"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.759577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbcjq\" (UniqueName: \"kubernetes.io/projected/a59977a8-3e8d-4fa9-866e-541d9e0d4bda-kube-api-access-fbcjq\") pod \"cinder-api-0\" (UID: \"a59977a8-3e8d-4fa9-866e-541d9e0d4bda\") " pod="openstack/cinder-api-0" Jan 11 17:51:37 crc kubenswrapper[4837]: I0111 17:51:37.956807 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 11 17:51:38 crc kubenswrapper[4837]: I0111 17:51:38.372561 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f941f2f8-0d7f-409f-929b-7bbba98e0ca5" path="/var/lib/kubelet/pods/f941f2f8-0d7f-409f-929b-7bbba98e0ca5/volumes" Jan 11 17:51:38 crc kubenswrapper[4837]: I0111 17:51:38.418372 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 11 17:51:38 crc kubenswrapper[4837]: I0111 17:51:38.590750 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a59977a8-3e8d-4fa9-866e-541d9e0d4bda","Type":"ContainerStarted","Data":"35e07dbd7ee22e619f3b001ae2314cb52bca64a75f2ff37f7092679dd70e89a9"} Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.100822 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c9967c84b-cfjvt" Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.162498 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76c97b5b58-7nlx9"] Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.163029 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76c97b5b58-7nlx9" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api" 
containerID="cri-o://475d5dd452fc80975f970d393333df82e4feac014fe06d78e0b3f1820bf31f19" gracePeriod=30 Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.163451 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76c97b5b58-7nlx9" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api-log" containerID="cri-o://b766ce5b582e0bdaabd51aaef7f27edaa2ca5b3c905c8a830f6bf72a066f1008" gracePeriod=30 Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.608918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a59977a8-3e8d-4fa9-866e-541d9e0d4bda","Type":"ContainerStarted","Data":"300c14007f394f17586e7d753967d9048e354093101132b8c7b7394a4431e27d"} Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.793584 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.830756 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d4fd56848-nmkm6" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.909192 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.967512 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-7p2rt"] Jan 11 17:51:39 crc kubenswrapper[4837]: I0111 17:51:39.968199 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" podUID="ce0fbcd3-57e1-437f-b004-7f609443f898" containerName="dnsmasq-dns" 
containerID="cri-o://71ae436473ecaeccb052dd25701507bd1adfa22c1b2c4afb6588a78d857f1798" gracePeriod=10 Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.617572 4837 generic.go:334] "Generic (PLEG): container finished" podID="ce0fbcd3-57e1-437f-b004-7f609443f898" containerID="71ae436473ecaeccb052dd25701507bd1adfa22c1b2c4afb6588a78d857f1798" exitCode=0 Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.617703 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" event={"ID":"ce0fbcd3-57e1-437f-b004-7f609443f898","Type":"ContainerDied","Data":"71ae436473ecaeccb052dd25701507bd1adfa22c1b2c4afb6588a78d857f1798"} Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.617929 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" event={"ID":"ce0fbcd3-57e1-437f-b004-7f609443f898","Type":"ContainerDied","Data":"de6c9e95f5ec9e4da5ceab00589f9279539847f46f12ed16290f478ebed9e673"} Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.617946 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de6c9e95f5ec9e4da5ceab00589f9279539847f46f12ed16290f478ebed9e673" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.621436 4837 generic.go:334] "Generic (PLEG): container finished" podID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerID="b766ce5b582e0bdaabd51aaef7f27edaa2ca5b3c905c8a830f6bf72a066f1008" exitCode=143 Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.621475 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c97b5b58-7nlx9" event={"ID":"ad710eac-6bbf-4930-9696-bd5bcd59be03","Type":"ContainerDied","Data":"b766ce5b582e0bdaabd51aaef7f27edaa2ca5b3c905c8a830f6bf72a066f1008"} Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.631126 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.712076 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-svc\") pod \"ce0fbcd3-57e1-437f-b004-7f609443f898\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.712143 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xb2t\" (UniqueName: \"kubernetes.io/projected/ce0fbcd3-57e1-437f-b004-7f609443f898-kube-api-access-5xb2t\") pod \"ce0fbcd3-57e1-437f-b004-7f609443f898\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.712171 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-sb\") pod \"ce0fbcd3-57e1-437f-b004-7f609443f898\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.712222 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-config\") pod \"ce0fbcd3-57e1-437f-b004-7f609443f898\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.712258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-swift-storage-0\") pod \"ce0fbcd3-57e1-437f-b004-7f609443f898\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.712354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-nb\") pod \"ce0fbcd3-57e1-437f-b004-7f609443f898\" (UID: \"ce0fbcd3-57e1-437f-b004-7f609443f898\") " Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.717620 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0fbcd3-57e1-437f-b004-7f609443f898-kube-api-access-5xb2t" (OuterVolumeSpecName: "kube-api-access-5xb2t") pod "ce0fbcd3-57e1-437f-b004-7f609443f898" (UID: "ce0fbcd3-57e1-437f-b004-7f609443f898"). InnerVolumeSpecName "kube-api-access-5xb2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.772366 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce0fbcd3-57e1-437f-b004-7f609443f898" (UID: "ce0fbcd3-57e1-437f-b004-7f609443f898"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.785199 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-config" (OuterVolumeSpecName: "config") pod "ce0fbcd3-57e1-437f-b004-7f609443f898" (UID: "ce0fbcd3-57e1-437f-b004-7f609443f898"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.793198 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce0fbcd3-57e1-437f-b004-7f609443f898" (UID: "ce0fbcd3-57e1-437f-b004-7f609443f898"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.803081 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce0fbcd3-57e1-437f-b004-7f609443f898" (UID: "ce0fbcd3-57e1-437f-b004-7f609443f898"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.814061 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.814085 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xb2t\" (UniqueName: \"kubernetes.io/projected/ce0fbcd3-57e1-437f-b004-7f609443f898-kube-api-access-5xb2t\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.814097 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.814106 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.814132 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.818083 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce0fbcd3-57e1-437f-b004-7f609443f898" (UID: "ce0fbcd3-57e1-437f-b004-7f609443f898"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.916122 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0fbcd3-57e1-437f-b004-7f609443f898-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:40 crc kubenswrapper[4837]: I0111 17:51:40.950042 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-79986b9d84-7gl5k" Jan 11 17:51:41 crc kubenswrapper[4837]: I0111 17:51:41.571008 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:41 crc kubenswrapper[4837]: I0111 17:51:41.644860 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-7p2rt" Jan 11 17:51:41 crc kubenswrapper[4837]: I0111 17:51:41.646206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a59977a8-3e8d-4fa9-866e-541d9e0d4bda","Type":"ContainerStarted","Data":"a8656cf4220b9e6303cabf9951f8e65b429603d412c064e44979b3c2125a2f81"} Jan 11 17:51:41 crc kubenswrapper[4837]: I0111 17:51:41.646255 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 11 17:51:41 crc kubenswrapper[4837]: I0111 17:51:41.684159 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.6841444800000005 podStartE2EDuration="4.68414448s" podCreationTimestamp="2026-01-11 17:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:41.675085007 +0000 UTC m=+1275.853277713" watchObservedRunningTime="2026-01-11 17:51:41.68414448 +0000 UTC m=+1275.862337176" Jan 11 17:51:41 crc kubenswrapper[4837]: I0111 17:51:41.696337 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-7p2rt"] Jan 11 17:51:41 crc kubenswrapper[4837]: I0111 17:51:41.706783 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-7p2rt"] Jan 11 17:51:42 crc kubenswrapper[4837]: I0111 17:51:42.359451 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76c97b5b58-7nlx9" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:38516->10.217.0.164:9311: read: connection reset by peer" Jan 11 17:51:42 crc kubenswrapper[4837]: I0111 17:51:42.359463 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76c97b5b58-7nlx9" 
podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:38518->10.217.0.164:9311: read: connection reset by peer" Jan 11 17:51:42 crc kubenswrapper[4837]: I0111 17:51:42.375933 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0fbcd3-57e1-437f-b004-7f609443f898" path="/var/lib/kubelet/pods/ce0fbcd3-57e1-437f-b004-7f609443f898/volumes" Jan 11 17:51:42 crc kubenswrapper[4837]: I0111 17:51:42.600257 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-86bdff5ffb-hdnql" Jan 11 17:51:42 crc kubenswrapper[4837]: I0111 17:51:42.666106 4837 generic.go:334] "Generic (PLEG): container finished" podID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerID="475d5dd452fc80975f970d393333df82e4feac014fe06d78e0b3f1820bf31f19" exitCode=0 Jan 11 17:51:42 crc kubenswrapper[4837]: I0111 17:51:42.667031 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c97b5b58-7nlx9" event={"ID":"ad710eac-6bbf-4930-9696-bd5bcd59be03","Type":"ContainerDied","Data":"475d5dd452fc80975f970d393333df82e4feac014fe06d78e0b3f1820bf31f19"} Jan 11 17:51:42 crc kubenswrapper[4837]: I0111 17:51:42.774766 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.433640 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.467516 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad710eac-6bbf-4930-9696-bd5bcd59be03-logs\") pod \"ad710eac-6bbf-4930-9696-bd5bcd59be03\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.467609 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-combined-ca-bundle\") pod \"ad710eac-6bbf-4930-9696-bd5bcd59be03\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.467798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data\") pod \"ad710eac-6bbf-4930-9696-bd5bcd59be03\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.467858 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data-custom\") pod \"ad710eac-6bbf-4930-9696-bd5bcd59be03\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.467915 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjrwb\" (UniqueName: \"kubernetes.io/projected/ad710eac-6bbf-4930-9696-bd5bcd59be03-kube-api-access-bjrwb\") pod \"ad710eac-6bbf-4930-9696-bd5bcd59be03\" (UID: \"ad710eac-6bbf-4930-9696-bd5bcd59be03\") " Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.469207 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ad710eac-6bbf-4930-9696-bd5bcd59be03-logs" (OuterVolumeSpecName: "logs") pod "ad710eac-6bbf-4930-9696-bd5bcd59be03" (UID: "ad710eac-6bbf-4930-9696-bd5bcd59be03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.489146 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad710eac-6bbf-4930-9696-bd5bcd59be03-kube-api-access-bjrwb" (OuterVolumeSpecName: "kube-api-access-bjrwb") pod "ad710eac-6bbf-4930-9696-bd5bcd59be03" (UID: "ad710eac-6bbf-4930-9696-bd5bcd59be03"). InnerVolumeSpecName "kube-api-access-bjrwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.490835 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad710eac-6bbf-4930-9696-bd5bcd59be03" (UID: "ad710eac-6bbf-4930-9696-bd5bcd59be03"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.504259 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad710eac-6bbf-4930-9696-bd5bcd59be03" (UID: "ad710eac-6bbf-4930-9696-bd5bcd59be03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.527513 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data" (OuterVolumeSpecName: "config-data") pod "ad710eac-6bbf-4930-9696-bd5bcd59be03" (UID: "ad710eac-6bbf-4930-9696-bd5bcd59be03"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.570646 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad710eac-6bbf-4930-9696-bd5bcd59be03-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.570720 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.570739 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.570756 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad710eac-6bbf-4930-9696-bd5bcd59be03-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.570771 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjrwb\" (UniqueName: \"kubernetes.io/projected/ad710eac-6bbf-4930-9696-bd5bcd59be03-kube-api-access-bjrwb\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.677580 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c97b5b58-7nlx9" event={"ID":"ad710eac-6bbf-4930-9696-bd5bcd59be03","Type":"ContainerDied","Data":"860e0b318088350ff726da9197bd034ff68fd92be56c5aa31fca2b3c475d1dc0"} Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.677990 4837 scope.go:117] "RemoveContainer" containerID="475d5dd452fc80975f970d393333df82e4feac014fe06d78e0b3f1820bf31f19" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.678157 4837 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c97b5b58-7nlx9" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.711837 4837 scope.go:117] "RemoveContainer" containerID="b766ce5b582e0bdaabd51aaef7f27edaa2ca5b3c905c8a830f6bf72a066f1008" Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.718027 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76c97b5b58-7nlx9"] Jan 11 17:51:43 crc kubenswrapper[4837]: I0111 17:51:43.727554 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-76c97b5b58-7nlx9"] Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.416020 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" path="/var/lib/kubelet/pods/ad710eac-6bbf-4930-9696-bd5bcd59be03/volumes" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.597370 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 11 17:51:44 crc kubenswrapper[4837]: E0111 17:51:44.598066 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.598196 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api" Jan 11 17:51:44 crc kubenswrapper[4837]: E0111 17:51:44.598308 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api-log" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.598387 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api-log" Jan 11 17:51:44 crc kubenswrapper[4837]: E0111 17:51:44.598478 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0fbcd3-57e1-437f-b004-7f609443f898" 
containerName="dnsmasq-dns" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.598547 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0fbcd3-57e1-437f-b004-7f609443f898" containerName="dnsmasq-dns" Jan 11 17:51:44 crc kubenswrapper[4837]: E0111 17:51:44.598652 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0fbcd3-57e1-437f-b004-7f609443f898" containerName="init" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.598753 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0fbcd3-57e1-437f-b004-7f609443f898" containerName="init" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.599053 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0fbcd3-57e1-437f-b004-7f609443f898" containerName="dnsmasq-dns" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.599135 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api-log" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.599314 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad710eac-6bbf-4930-9696-bd5bcd59be03" containerName="barbican-api" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.600186 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.602258 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-cd5k9" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.602420 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.602528 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.608398 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.711902 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.711985 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-openstack-config\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.712049 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxpqc\" (UniqueName: \"kubernetes.io/projected/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-kube-api-access-hxpqc\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.712087 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-openstack-config-secret\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.813552 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.813633 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-openstack-config\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.813712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxpqc\" (UniqueName: \"kubernetes.io/projected/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-kube-api-access-hxpqc\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.813750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-openstack-config-secret\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.815744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-openstack-config\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.820566 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-openstack-config-secret\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.820822 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.841695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxpqc\" (UniqueName: \"kubernetes.io/projected/a2380f65-2f68-4a02-95c4-b3fd94ba3adc-kube-api-access-hxpqc\") pod \"openstackclient\" (UID: \"a2380f65-2f68-4a02-95c4-b3fd94ba3adc\") " pod="openstack/openstackclient" Jan 11 17:51:44 crc kubenswrapper[4837]: I0111 17:51:44.941138 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.047100 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.097302 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.247890 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75f9d98d89-rxjpb" Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.313232 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d654f854d-9wdwv"] Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.313466 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d654f854d-9wdwv" podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-api" containerID="cri-o://bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be" gracePeriod=30 Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.313594 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d654f854d-9wdwv" podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-httpd" containerID="cri-o://06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0" gracePeriod=30 Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.435332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.698491 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a2380f65-2f68-4a02-95c4-b3fd94ba3adc","Type":"ContainerStarted","Data":"eb2ad4c9ec78b03d9378705fc0e5ecf10b69eb3a6ca6de32258a8e860baa7f69"} Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.700285 4837 generic.go:334] "Generic (PLEG): container 
finished" podID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerID="06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0" exitCode=0 Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.700353 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d654f854d-9wdwv" event={"ID":"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d","Type":"ContainerDied","Data":"06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0"} Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.700481 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="cinder-scheduler" containerID="cri-o://b6a679bed9d738026e66ca403cabbc6df8c6a4ec474b2e5487c4b9b0b0221cf5" gracePeriod=30 Jan 11 17:51:45 crc kubenswrapper[4837]: I0111 17:51:45.700531 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="probe" containerID="cri-o://4dbb2d3932eec10095c838000b83e22f0b2f31af8208deff6a1a6a290468f6e5" gracePeriod=30 Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.714431 4837 generic.go:334] "Generic (PLEG): container finished" podID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerID="4dbb2d3932eec10095c838000b83e22f0b2f31af8208deff6a1a6a290468f6e5" exitCode=0 Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.715080 4837 generic.go:334] "Generic (PLEG): container finished" podID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerID="b6a679bed9d738026e66ca403cabbc6df8c6a4ec474b2e5487c4b9b0b0221cf5" exitCode=0 Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.714524 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd8316e1-8d7c-4c39-82d3-d200b7d6b164","Type":"ContainerDied","Data":"4dbb2d3932eec10095c838000b83e22f0b2f31af8208deff6a1a6a290468f6e5"} Jan 11 17:51:46 crc 
kubenswrapper[4837]: I0111 17:51:46.715115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd8316e1-8d7c-4c39-82d3-d200b7d6b164","Type":"ContainerDied","Data":"b6a679bed9d738026e66ca403cabbc6df8c6a4ec474b2e5487c4b9b0b0221cf5"} Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.949931 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.980646 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-combined-ca-bundle\") pod \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.980798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data-custom\") pod \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.980858 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-etc-machine-id\") pod \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.980948 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-scripts\") pod \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.981029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-s2pg5\" (UniqueName: \"kubernetes.io/projected/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-kube-api-access-s2pg5\") pod \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.981054 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dd8316e1-8d7c-4c39-82d3-d200b7d6b164" (UID: "dd8316e1-8d7c-4c39-82d3-d200b7d6b164"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.982964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data\") pod \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\" (UID: \"dd8316e1-8d7c-4c39-82d3-d200b7d6b164\") " Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.986367 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-kube-api-access-s2pg5" (OuterVolumeSpecName: "kube-api-access-s2pg5") pod "dd8316e1-8d7c-4c39-82d3-d200b7d6b164" (UID: "dd8316e1-8d7c-4c39-82d3-d200b7d6b164"). InnerVolumeSpecName "kube-api-access-s2pg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.991298 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dd8316e1-8d7c-4c39-82d3-d200b7d6b164" (UID: "dd8316e1-8d7c-4c39-82d3-d200b7d6b164"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:46 crc kubenswrapper[4837]: I0111 17:51:46.993825 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-scripts" (OuterVolumeSpecName: "scripts") pod "dd8316e1-8d7c-4c39-82d3-d200b7d6b164" (UID: "dd8316e1-8d7c-4c39-82d3-d200b7d6b164"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:46.999961 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2pg5\" (UniqueName: \"kubernetes.io/projected/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-kube-api-access-s2pg5\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.000001 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.000014 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.000024 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.118134 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd8316e1-8d7c-4c39-82d3-d200b7d6b164" (UID: "dd8316e1-8d7c-4c39-82d3-d200b7d6b164"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.205814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data" (OuterVolumeSpecName: "config-data") pod "dd8316e1-8d7c-4c39-82d3-d200b7d6b164" (UID: "dd8316e1-8d7c-4c39-82d3-d200b7d6b164"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.206920 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.206938 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8316e1-8d7c-4c39-82d3-d200b7d6b164-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.574895 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.612933 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-combined-ca-bundle\") pod \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.613197 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-httpd-config\") pod \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.613341 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-config\") pod \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.615147 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv2ds\" (UniqueName: \"kubernetes.io/projected/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-kube-api-access-lv2ds\") pod \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.615330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-ovndb-tls-certs\") pod \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\" (UID: \"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d\") " Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.621028 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-kube-api-access-lv2ds" (OuterVolumeSpecName: "kube-api-access-lv2ds") pod "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" (UID: "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d"). InnerVolumeSpecName "kube-api-access-lv2ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.641771 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" (UID: "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.667694 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" (UID: "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.675279 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-config" (OuterVolumeSpecName: "config") pod "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" (UID: "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.705828 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" (UID: "6bce9623-f279-4dfb-9cc2-e5a2de56ca9d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.717071 4837 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.717095 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.717104 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.717114 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.717123 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv2ds\" (UniqueName: \"kubernetes.io/projected/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d-kube-api-access-lv2ds\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.730983 4837 generic.go:334] "Generic (PLEG): container finished" podID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerID="bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be" exitCode=0 Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.731069 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d654f854d-9wdwv" event={"ID":"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d","Type":"ContainerDied","Data":"bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be"} Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 
17:51:47.731114 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d654f854d-9wdwv" event={"ID":"6bce9623-f279-4dfb-9cc2-e5a2de56ca9d","Type":"ContainerDied","Data":"dcae9db6e43cfdf9074ff3194356c0877b6a47345ee3c3d5ed3e3d2cf4f6fab9"} Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.731135 4837 scope.go:117] "RemoveContainer" containerID="06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.731300 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d654f854d-9wdwv" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.740453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd8316e1-8d7c-4c39-82d3-d200b7d6b164","Type":"ContainerDied","Data":"7d7e6e770d9baa5854a7fc215c60b7e52bdbbfb361c6fe499e037f0b38dc82f9"} Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.740525 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.774876 4837 scope.go:117] "RemoveContainer" containerID="bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.797851 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d654f854d-9wdwv"] Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.817328 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d654f854d-9wdwv"] Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.830138 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.834739 4837 scope.go:117] "RemoveContainer" containerID="06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0" Jan 11 17:51:47 crc kubenswrapper[4837]: E0111 17:51:47.841221 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0\": container with ID starting with 06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0 not found: ID does not exist" containerID="06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.841278 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0"} err="failed to get container status \"06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0\": rpc error: code = NotFound desc = could not find container \"06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0\": container with ID starting with 06151ad9cf9b5bdb88feae9a1589ae6fc974d40700356999d6d43f6a1804ceb0 not found: ID does not exist" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 
17:51:47.841313 4837 scope.go:117] "RemoveContainer" containerID="bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.841420 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:47 crc kubenswrapper[4837]: E0111 17:51:47.845212 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be\": container with ID starting with bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be not found: ID does not exist" containerID="bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.845256 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be"} err="failed to get container status \"bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be\": rpc error: code = NotFound desc = could not find container \"bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be\": container with ID starting with bd93cd6d878e837e6b6c2da1a9ac01402a4248d2dcec952dda491834a59579be not found: ID does not exist" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.845310 4837 scope.go:117] "RemoveContainer" containerID="4dbb2d3932eec10095c838000b83e22f0b2f31af8208deff6a1a6a290468f6e5" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.853428 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:47 crc kubenswrapper[4837]: E0111 17:51:47.853857 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-api" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.853879 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-api" Jan 11 17:51:47 crc kubenswrapper[4837]: E0111 17:51:47.853910 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-httpd" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.853918 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-httpd" Jan 11 17:51:47 crc kubenswrapper[4837]: E0111 17:51:47.853937 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="cinder-scheduler" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.853946 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="cinder-scheduler" Jan 11 17:51:47 crc kubenswrapper[4837]: E0111 17:51:47.853963 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="probe" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.853971 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="probe" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.854223 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-httpd" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.854254 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="cinder-scheduler" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.854271 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" containerName="probe" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.854286 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" containerName="neutron-api" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.855502 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.858720 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.861119 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.920876 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.920968 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhqs\" (UniqueName: \"kubernetes.io/projected/f6b872cd-f683-45bb-94db-710d997ef648-kube-api-access-mrhqs\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.921069 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.921110 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.921188 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.921262 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b872cd-f683-45bb-94db-710d997ef648-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:47 crc kubenswrapper[4837]: I0111 17:51:47.926188 4837 scope.go:117] "RemoveContainer" containerID="b6a679bed9d738026e66ca403cabbc6df8c6a4ec474b2e5487c4b9b0b0221cf5" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.023853 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhqs\" (UniqueName: \"kubernetes.io/projected/f6b872cd-f683-45bb-94db-710d997ef648-kube-api-access-mrhqs\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.023954 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.023998 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.024035 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.024060 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b872cd-f683-45bb-94db-710d997ef648-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.024118 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.024449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6b872cd-f683-45bb-94db-710d997ef648-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.031195 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.032072 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-scripts\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.034997 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-config-data\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.042710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhqs\" (UniqueName: \"kubernetes.io/projected/f6b872cd-f683-45bb-94db-710d997ef648-kube-api-access-mrhqs\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.043023 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b872cd-f683-45bb-94db-710d997ef648-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f6b872cd-f683-45bb-94db-710d997ef648\") " pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.228131 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.377421 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bce9623-f279-4dfb-9cc2-e5a2de56ca9d" path="/var/lib/kubelet/pods/6bce9623-f279-4dfb-9cc2-e5a2de56ca9d/volumes" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.378293 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8316e1-8d7c-4c39-82d3-d200b7d6b164" path="/var/lib/kubelet/pods/dd8316e1-8d7c-4c39-82d3-d200b7d6b164/volumes" Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.727912 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 11 17:51:48 crc kubenswrapper[4837]: I0111 17:51:48.763290 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b872cd-f683-45bb-94db-710d997ef648","Type":"ContainerStarted","Data":"1c7f3bfff01fa087bf2fb4c12b8d4b4a16764f5010645e3c566bb7fc488010ab"} Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.201771 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-85f864d5b5-z8rsp"] Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.203995 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.211165 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.211757 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.218837 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85f864d5b5-z8rsp"] Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.220468 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.356982 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-config-data\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.357029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/134689b3-4006-4e5e-a051-cf51f6c9cf51-log-httpd\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.357068 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69vzf\" (UniqueName: \"kubernetes.io/projected/134689b3-4006-4e5e-a051-cf51f6c9cf51-kube-api-access-69vzf\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: 
I0111 17:51:49.357267 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-combined-ca-bundle\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.357330 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/134689b3-4006-4e5e-a051-cf51f6c9cf51-etc-swift\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.357433 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/134689b3-4006-4e5e-a051-cf51f6c9cf51-run-httpd\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.357570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-internal-tls-certs\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.357603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-public-tls-certs\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 
17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.458648 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-internal-tls-certs\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.458700 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-public-tls-certs\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.458736 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-config-data\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.458760 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/134689b3-4006-4e5e-a051-cf51f6c9cf51-log-httpd\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.458787 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69vzf\" (UniqueName: \"kubernetes.io/projected/134689b3-4006-4e5e-a051-cf51f6c9cf51-kube-api-access-69vzf\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 
17:51:49.458824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-combined-ca-bundle\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.458846 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/134689b3-4006-4e5e-a051-cf51f6c9cf51-etc-swift\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.458879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/134689b3-4006-4e5e-a051-cf51f6c9cf51-run-httpd\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.459282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/134689b3-4006-4e5e-a051-cf51f6c9cf51-run-httpd\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.460935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/134689b3-4006-4e5e-a051-cf51f6c9cf51-log-httpd\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.463801 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-internal-tls-certs\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.464043 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-combined-ca-bundle\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.464757 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-public-tls-certs\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.465153 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134689b3-4006-4e5e-a051-cf51f6c9cf51-config-data\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.466926 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/134689b3-4006-4e5e-a051-cf51f6c9cf51-etc-swift\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.478302 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69vzf\" (UniqueName: 
\"kubernetes.io/projected/134689b3-4006-4e5e-a051-cf51f6c9cf51-kube-api-access-69vzf\") pod \"swift-proxy-85f864d5b5-z8rsp\" (UID: \"134689b3-4006-4e5e-a051-cf51f6c9cf51\") " pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.529308 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.794331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b872cd-f683-45bb-94db-710d997ef648","Type":"ContainerStarted","Data":"134f4ebb61c8c337622762b8798e3cad72fe75f723e19304cd2035c2c61723f8"} Jan 11 17:51:49 crc kubenswrapper[4837]: I0111 17:51:49.830694 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d4fd56848-nmkm6" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Jan 11 17:51:50 crc kubenswrapper[4837]: I0111 17:51:50.087952 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 11 17:51:50 crc kubenswrapper[4837]: I0111 17:51:50.098638 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85f864d5b5-z8rsp"] Jan 11 17:51:50 crc kubenswrapper[4837]: W0111 17:51:50.110624 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod134689b3_4006_4e5e_a051_cf51f6c9cf51.slice/crio-a4c99219d1ac564e257141576519d16df8c1f57ad770082b3007d76f5ba936ee WatchSource:0}: Error finding container a4c99219d1ac564e257141576519d16df8c1f57ad770082b3007d76f5ba936ee: Status 404 returned error can't find the container with id a4c99219d1ac564e257141576519d16df8c1f57ad770082b3007d76f5ba936ee Jan 11 17:51:50 crc 
kubenswrapper[4837]: I0111 17:51:50.807849 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f864d5b5-z8rsp" event={"ID":"134689b3-4006-4e5e-a051-cf51f6c9cf51","Type":"ContainerStarted","Data":"e4ab484727ee72ab1ead9878bc15997b619b8abbd0433a2eab71897e0e0ba797"} Jan 11 17:51:50 crc kubenswrapper[4837]: I0111 17:51:50.808233 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:50 crc kubenswrapper[4837]: I0111 17:51:50.808251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f864d5b5-z8rsp" event={"ID":"134689b3-4006-4e5e-a051-cf51f6c9cf51","Type":"ContainerStarted","Data":"a7e200141e08fe25ae1ca48fac8a1514abdbc5a9d4fa5ddc778c15b15799f8d0"} Jan 11 17:51:50 crc kubenswrapper[4837]: I0111 17:51:50.808264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f864d5b5-z8rsp" event={"ID":"134689b3-4006-4e5e-a051-cf51f6c9cf51","Type":"ContainerStarted","Data":"a4c99219d1ac564e257141576519d16df8c1f57ad770082b3007d76f5ba936ee"} Jan 11 17:51:50 crc kubenswrapper[4837]: I0111 17:51:50.810602 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f6b872cd-f683-45bb-94db-710d997ef648","Type":"ContainerStarted","Data":"688d5153c9998426c442f1bd160873be3e7e1ee83fe6875f54c31aa8e329a6d8"} Jan 11 17:51:50 crc kubenswrapper[4837]: I0111 17:51:50.830213 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-85f864d5b5-z8rsp" podStartSLOduration=1.830197145 podStartE2EDuration="1.830197145s" podCreationTimestamp="2026-01-11 17:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:50.82930296 +0000 UTC m=+1285.007495656" watchObservedRunningTime="2026-01-11 17:51:50.830197145 +0000 UTC m=+1285.008389851" Jan 11 17:51:50 crc 
kubenswrapper[4837]: I0111 17:51:50.849950 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.849931565 podStartE2EDuration="3.849931565s" podCreationTimestamp="2026-01-11 17:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:51:50.845813604 +0000 UTC m=+1285.024006310" watchObservedRunningTime="2026-01-11 17:51:50.849931565 +0000 UTC m=+1285.028124271" Jan 11 17:51:51 crc kubenswrapper[4837]: I0111 17:51:51.821605 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:53 crc kubenswrapper[4837]: I0111 17:51:53.229934 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 11 17:51:53 crc kubenswrapper[4837]: I0111 17:51:53.843642 4837 generic.go:334] "Generic (PLEG): container finished" podID="5ec0beaf-de63-407f-8d18-46738023ab11" containerID="b9d3d17ee78713de367859cb9ff25e286651fefcf150555d02643be60b2440a9" exitCode=137 Jan 11 17:51:53 crc kubenswrapper[4837]: I0111 17:51:53.843761 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerDied","Data":"b9d3d17ee78713de367859cb9ff25e286651fefcf150555d02643be60b2440a9"} Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.166919 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.322378 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-sg-core-conf-yaml\") pod \"5ec0beaf-de63-407f-8d18-46738023ab11\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.322502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k964n\" (UniqueName: \"kubernetes.io/projected/5ec0beaf-de63-407f-8d18-46738023ab11-kube-api-access-k964n\") pod \"5ec0beaf-de63-407f-8d18-46738023ab11\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.322537 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-run-httpd\") pod \"5ec0beaf-de63-407f-8d18-46738023ab11\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.322560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-combined-ca-bundle\") pod \"5ec0beaf-de63-407f-8d18-46738023ab11\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.322602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-log-httpd\") pod \"5ec0beaf-de63-407f-8d18-46738023ab11\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.322659 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-scripts\") pod \"5ec0beaf-de63-407f-8d18-46738023ab11\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.322781 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-config-data\") pod \"5ec0beaf-de63-407f-8d18-46738023ab11\" (UID: \"5ec0beaf-de63-407f-8d18-46738023ab11\") " Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.323844 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ec0beaf-de63-407f-8d18-46738023ab11" (UID: "5ec0beaf-de63-407f-8d18-46738023ab11"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.324218 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ec0beaf-de63-407f-8d18-46738023ab11" (UID: "5ec0beaf-de63-407f-8d18-46738023ab11"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.329062 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.329300 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-log" containerID="cri-o://97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751" gracePeriod=30 Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.329775 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-httpd" containerID="cri-o://3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe" gracePeriod=30 Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.330933 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec0beaf-de63-407f-8d18-46738023ab11-kube-api-access-k964n" (OuterVolumeSpecName: "kube-api-access-k964n") pod "5ec0beaf-de63-407f-8d18-46738023ab11" (UID: "5ec0beaf-de63-407f-8d18-46738023ab11"). InnerVolumeSpecName "kube-api-access-k964n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.338801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-scripts" (OuterVolumeSpecName: "scripts") pod "5ec0beaf-de63-407f-8d18-46738023ab11" (UID: "5ec0beaf-de63-407f-8d18-46738023ab11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.371228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ec0beaf-de63-407f-8d18-46738023ab11" (UID: "5ec0beaf-de63-407f-8d18-46738023ab11"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.406051 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ec0beaf-de63-407f-8d18-46738023ab11" (UID: "5ec0beaf-de63-407f-8d18-46738023ab11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.415708 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-config-data" (OuterVolumeSpecName: "config-data") pod "5ec0beaf-de63-407f-8d18-46738023ab11" (UID: "5ec0beaf-de63-407f-8d18-46738023ab11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.425853 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k964n\" (UniqueName: \"kubernetes.io/projected/5ec0beaf-de63-407f-8d18-46738023ab11-kube-api-access-k964n\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.425885 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.425896 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.425906 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ec0beaf-de63-407f-8d18-46738023ab11-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.425916 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.425924 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.425934 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ec0beaf-de63-407f-8d18-46738023ab11-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.908539 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.909021 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ec0beaf-de63-407f-8d18-46738023ab11","Type":"ContainerDied","Data":"321d247207531c2c1fe0901940f4550855249a0d284bd3a493d398bea93ffdfc"} Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.909085 4837 scope.go:117] "RemoveContainer" containerID="b9d3d17ee78713de367859cb9ff25e286651fefcf150555d02643be60b2440a9" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.911083 4837 generic.go:334] "Generic (PLEG): container finished" podID="979d7c48-3688-478f-bb46-d78b535b84dc" containerID="97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751" exitCode=143 Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.911181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"979d7c48-3688-478f-bb46-d78b535b84dc","Type":"ContainerDied","Data":"97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751"} Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.914812 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a2380f65-2f68-4a02-95c4-b3fd94ba3adc","Type":"ContainerStarted","Data":"853f2f2285fd8a1ec0ea52a83ef86ae584070d359305afc79d342092b73ad725"} Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.928983 4837 scope.go:117] "RemoveContainer" containerID="a2be37e1652ad8ecbd0c0c8eb6badcd102f2579adebd7c475651f8d2fad86993" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.944917 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.550647199 podStartE2EDuration="13.944901068s" podCreationTimestamp="2026-01-11 17:51:44 +0000 UTC" firstStartedPulling="2026-01-11 17:51:45.435190859 +0000 UTC m=+1279.613383555" 
lastFinishedPulling="2026-01-11 17:51:56.829444718 +0000 UTC m=+1291.007637424" observedRunningTime="2026-01-11 17:51:57.938745603 +0000 UTC m=+1292.116938319" watchObservedRunningTime="2026-01-11 17:51:57.944901068 +0000 UTC m=+1292.123093774" Jan 11 17:51:57 crc kubenswrapper[4837]: I0111 17:51:57.952662 4837 scope.go:117] "RemoveContainer" containerID="6365fbb22458d0aa6474e7c381cd0e77f5f0e1ffa3601c5afba83111f5f7eff4" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:57.989778 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.001352 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.009163 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:51:58 crc kubenswrapper[4837]: E0111 17:51:58.009486 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="ceilometer-notification-agent" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.009501 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="ceilometer-notification-agent" Jan 11 17:51:58 crc kubenswrapper[4837]: E0111 17:51:58.009512 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="proxy-httpd" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.009519 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="proxy-httpd" Jan 11 17:51:58 crc kubenswrapper[4837]: E0111 17:51:58.009543 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="sg-core" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.009550 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="sg-core" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.009727 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="proxy-httpd" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.009738 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="sg-core" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.009750 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" containerName="ceilometer-notification-agent" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.011575 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.016860 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.016941 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.032831 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.136475 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.136528 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-scripts\") pod \"ceilometer-0\" (UID: 
\"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.136647 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-config-data\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.136800 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.136837 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-run-httpd\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.136854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-log-httpd\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.136907 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xld5s\" (UniqueName: \"kubernetes.io/projected/5c4f4894-6491-47db-8b74-f62a2aa8e39b-kube-api-access-xld5s\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 
17:51:58.239111 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.239168 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-run-httpd\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.239188 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-log-httpd\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.239222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xld5s\" (UniqueName: \"kubernetes.io/projected/5c4f4894-6491-47db-8b74-f62a2aa8e39b-kube-api-access-xld5s\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.239247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.239283 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-scripts\") pod 
\"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.239316 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-config-data\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.241417 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-log-httpd\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.242530 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-run-httpd\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.246468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-scripts\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.246737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.246839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.247075 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-config-data\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.263533 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xld5s\" (UniqueName: \"kubernetes.io/projected/5c4f4894-6491-47db-8b74-f62a2aa8e39b-kube-api-access-xld5s\") pod \"ceilometer-0\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.330958 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.378120 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec0beaf-de63-407f-8d18-46738023ab11" path="/var/lib/kubelet/pods/5ec0beaf-de63-407f-8d18-46738023ab11/volumes" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.491115 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.788484 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:51:58 crc kubenswrapper[4837]: W0111 17:51:58.797984 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c4f4894_6491_47db_8b74_f62a2aa8e39b.slice/crio-75c579f9c4ffb6cee80c6437a73f56e8e1bdc42a396b2537ab4f77980d10a341 WatchSource:0}: Error finding container 
75c579f9c4ffb6cee80c6437a73f56e8e1bdc42a396b2537ab4f77980d10a341: Status 404 returned error can't find the container with id 75c579f9c4ffb6cee80c6437a73f56e8e1bdc42a396b2537ab4f77980d10a341 Jan 11 17:51:58 crc kubenswrapper[4837]: I0111 17:51:58.925409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerStarted","Data":"75c579f9c4ffb6cee80c6437a73f56e8e1bdc42a396b2537ab4f77980d10a341"} Jan 11 17:51:59 crc kubenswrapper[4837]: I0111 17:51:59.542250 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:59 crc kubenswrapper[4837]: I0111 17:51:59.548091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85f864d5b5-z8rsp" Jan 11 17:51:59 crc kubenswrapper[4837]: I0111 17:51:59.746758 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:51:59 crc kubenswrapper[4837]: I0111 17:51:59.831710 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d4fd56848-nmkm6" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Jan 11 17:51:59 crc kubenswrapper[4837]: I0111 17:51:59.937335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerStarted","Data":"838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250"} Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.474338 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-log" probeResult="failure" output="Get 
\"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:55222->10.217.0.154:9292: read: connection reset by peer" Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.474367 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:55230->10.217.0.154:9292: read: connection reset by peer" Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.945137 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.949077 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerStarted","Data":"621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e"} Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.950765 4837 generic.go:334] "Generic (PLEG): container finished" podID="979d7c48-3688-478f-bb46-d78b535b84dc" containerID="3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe" exitCode=0 Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.950793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"979d7c48-3688-478f-bb46-d78b535b84dc","Type":"ContainerDied","Data":"3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe"} Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.950811 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"979d7c48-3688-478f-bb46-d78b535b84dc","Type":"ContainerDied","Data":"ca1e84dc3cb6f40a2763a349082811b2575c3fb7758df69a625605bc60e628e9"} Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.950828 4837 scope.go:117] "RemoveContainer" 
containerID="3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe" Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.950940 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:00 crc kubenswrapper[4837]: I0111 17:52:00.992832 4837 scope.go:117] "RemoveContainer" containerID="97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.020045 4837 scope.go:117] "RemoveContainer" containerID="3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe" Jan 11 17:52:01 crc kubenswrapper[4837]: E0111 17:52:01.023893 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe\": container with ID starting with 3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe not found: ID does not exist" containerID="3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.023940 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe"} err="failed to get container status \"3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe\": rpc error: code = NotFound desc = could not find container \"3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe\": container with ID starting with 3430e6565caca4832c1e2cc748670ee244532fb427c7b39728e31430be081ebe not found: ID does not exist" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.023966 4837 scope.go:117] "RemoveContainer" containerID="97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751" Jan 11 17:52:01 crc kubenswrapper[4837]: E0111 17:52:01.025911 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751\": container with ID starting with 97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751 not found: ID does not exist" containerID="97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.025941 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751"} err="failed to get container status \"97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751\": rpc error: code = NotFound desc = could not find container \"97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751\": container with ID starting with 97a13017e7d586891bc8a96f1fd39c70b32c09231632f3f470b5437de2e72751 not found: ID does not exist" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.085831 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-combined-ca-bundle\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.085919 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-scripts\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.085981 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-config-data\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc 
kubenswrapper[4837]: I0111 17:52:01.086011 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-httpd-run\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.086028 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.086059 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-logs\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.086092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-internal-tls-certs\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.086111 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7grhl\" (UniqueName: \"kubernetes.io/projected/979d7c48-3688-478f-bb46-d78b535b84dc-kube-api-access-7grhl\") pod \"979d7c48-3688-478f-bb46-d78b535b84dc\" (UID: \"979d7c48-3688-478f-bb46-d78b535b84dc\") " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.089480 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-logs" (OuterVolumeSpecName: "logs") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: 
"979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.089654 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: "979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.092594 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979d7c48-3688-478f-bb46-d78b535b84dc-kube-api-access-7grhl" (OuterVolumeSpecName: "kube-api-access-7grhl") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: "979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "kube-api-access-7grhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.094790 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: "979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.098403 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-scripts" (OuterVolumeSpecName: "scripts") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: "979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.121824 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: "979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.144514 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-config-data" (OuterVolumeSpecName: "config-data") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: "979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.144606 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "979d7c48-3688-478f-bb46-d78b535b84dc" (UID: "979d7c48-3688-478f-bb46-d78b535b84dc"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187532 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187560 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187569 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187577 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187600 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187609 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979d7c48-3688-478f-bb46-d78b535b84dc-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187617 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979d7c48-3688-478f-bb46-d78b535b84dc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.187626 4837 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7grhl\" (UniqueName: \"kubernetes.io/projected/979d7c48-3688-478f-bb46-d78b535b84dc-kube-api-access-7grhl\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.205378 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.289572 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.294817 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.312271 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.322467 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:52:01 crc kubenswrapper[4837]: E0111 17:52:01.322895 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-httpd" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.322911 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-httpd" Jan 11 17:52:01 crc kubenswrapper[4837]: E0111 17:52:01.322929 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-log" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.322936 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-log" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.323092 4837 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-log" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.323116 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" containerName="glance-httpd" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.324054 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.329564 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.356109 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.356415 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.492394 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.492586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7dc\" (UniqueName: \"kubernetes.io/projected/57a88fde-50af-4286-b9c6-8a5300b7f26b-kube-api-access-fp7dc\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.492780 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a88fde-50af-4286-b9c6-8a5300b7f26b-logs\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.492905 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.492971 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.492999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.493043 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.493253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57a88fde-50af-4286-b9c6-8a5300b7f26b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.594896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7dc\" (UniqueName: \"kubernetes.io/projected/57a88fde-50af-4286-b9c6-8a5300b7f26b-kube-api-access-fp7dc\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.594943 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a88fde-50af-4286-b9c6-8a5300b7f26b-logs\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.594978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595004 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595051 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57a88fde-50af-4286-b9c6-8a5300b7f26b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595148 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595743 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a88fde-50af-4286-b9c6-8a5300b7f26b-logs\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595792 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57a88fde-50af-4286-b9c6-8a5300b7f26b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.595916 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.600305 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.604237 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.604499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.613260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7dc\" (UniqueName: \"kubernetes.io/projected/57a88fde-50af-4286-b9c6-8a5300b7f26b-kube-api-access-fp7dc\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " 
pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.615411 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a88fde-50af-4286-b9c6-8a5300b7f26b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.625196 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"57a88fde-50af-4286-b9c6-8a5300b7f26b\") " pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.681281 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:01 crc kubenswrapper[4837]: I0111 17:52:01.963998 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerStarted","Data":"570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887"} Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.205860 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.381541 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="979d7c48-3688-478f-bb46-d78b535b84dc" path="/var/lib/kubelet/pods/979d7c48-3688-478f-bb46-d78b535b84dc/volumes" Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.974433 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerStarted","Data":"17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac"} Jan 11 
17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.974573 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="ceilometer-central-agent" containerID="cri-o://838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250" gracePeriod=30 Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.974752 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="ceilometer-notification-agent" containerID="cri-o://621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e" gracePeriod=30 Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.974805 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="proxy-httpd" containerID="cri-o://17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac" gracePeriod=30 Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.974818 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="sg-core" containerID="cri-o://570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887" gracePeriod=30 Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.974932 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.980885 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"57a88fde-50af-4286-b9c6-8a5300b7f26b","Type":"ContainerStarted","Data":"a5cd88e4fee82ae1cdefd3757fd23d612e4c8f9559e2df9675b33f58e2a30ce7"} Jan 11 17:52:02 crc kubenswrapper[4837]: I0111 17:52:02.980923 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"57a88fde-50af-4286-b9c6-8a5300b7f26b","Type":"ContainerStarted","Data":"5e697696a0f0d58d20df9239129bc04816018d77d7f98c1ecc09af0e850da397"} Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.010034 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.188552382 podStartE2EDuration="6.010018572s" podCreationTimestamp="2026-01-11 17:51:57 +0000 UTC" firstStartedPulling="2026-01-11 17:51:58.80053192 +0000 UTC m=+1292.978724626" lastFinishedPulling="2026-01-11 17:52:02.62199811 +0000 UTC m=+1296.800190816" observedRunningTime="2026-01-11 17:52:02.999311905 +0000 UTC m=+1297.177504611" watchObservedRunningTime="2026-01-11 17:52:03.010018572 +0000 UTC m=+1297.188211278" Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.990972 4837 generic.go:334] "Generic (PLEG): container finished" podID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerID="6d7775187bc9d8933cd74c6fb7ea42d457f38072bf52d7ec4be27690a1b39be4" exitCode=137 Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.991452 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d4fd56848-nmkm6" event={"ID":"af5aeb3b-e789-4f43-ac70-bb570e59027e","Type":"ContainerDied","Data":"6d7775187bc9d8933cd74c6fb7ea42d457f38072bf52d7ec4be27690a1b39be4"} Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.998646 4837 generic.go:334] "Generic (PLEG): container finished" podID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerID="17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac" exitCode=0 Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.998701 4837 generic.go:334] "Generic (PLEG): container finished" podID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerID="570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887" exitCode=2 Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.998715 4837 generic.go:334] "Generic (PLEG): container 
finished" podID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerID="621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e" exitCode=0 Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.998762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerDied","Data":"17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac"} Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.998823 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerDied","Data":"570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887"} Jan 11 17:52:03 crc kubenswrapper[4837]: I0111 17:52:03.998843 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerDied","Data":"621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e"} Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.000853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"57a88fde-50af-4286-b9c6-8a5300b7f26b","Type":"ContainerStarted","Data":"efe4497f5f0330ced64df5ad5d717be79ee40158e3201bdf7fb9d81c54f998f7"} Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.023992 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.023977556 podStartE2EDuration="3.023977556s" podCreationTimestamp="2026-01-11 17:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:52:04.018118578 +0000 UTC m=+1298.196311284" watchObservedRunningTime="2026-01-11 17:52:04.023977556 +0000 UTC m=+1298.202170262" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.097293 4837 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.270902 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-secret-key\") pod \"af5aeb3b-e789-4f43-ac70-bb570e59027e\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.270967 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-combined-ca-bundle\") pod \"af5aeb3b-e789-4f43-ac70-bb570e59027e\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.271013 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-tls-certs\") pod \"af5aeb3b-e789-4f43-ac70-bb570e59027e\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.271178 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt67d\" (UniqueName: \"kubernetes.io/projected/af5aeb3b-e789-4f43-ac70-bb570e59027e-kube-api-access-vt67d\") pod \"af5aeb3b-e789-4f43-ac70-bb570e59027e\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.271201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-config-data\") pod \"af5aeb3b-e789-4f43-ac70-bb570e59027e\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.271294 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af5aeb3b-e789-4f43-ac70-bb570e59027e-logs\") pod \"af5aeb3b-e789-4f43-ac70-bb570e59027e\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.271373 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-scripts\") pod \"af5aeb3b-e789-4f43-ac70-bb570e59027e\" (UID: \"af5aeb3b-e789-4f43-ac70-bb570e59027e\") " Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.278632 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af5aeb3b-e789-4f43-ac70-bb570e59027e-logs" (OuterVolumeSpecName: "logs") pod "af5aeb3b-e789-4f43-ac70-bb570e59027e" (UID: "af5aeb3b-e789-4f43-ac70-bb570e59027e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.286782 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "af5aeb3b-e789-4f43-ac70-bb570e59027e" (UID: "af5aeb3b-e789-4f43-ac70-bb570e59027e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.305888 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5aeb3b-e789-4f43-ac70-bb570e59027e-kube-api-access-vt67d" (OuterVolumeSpecName: "kube-api-access-vt67d") pod "af5aeb3b-e789-4f43-ac70-bb570e59027e" (UID: "af5aeb3b-e789-4f43-ac70-bb570e59027e"). InnerVolumeSpecName "kube-api-access-vt67d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.343589 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-scripts" (OuterVolumeSpecName: "scripts") pod "af5aeb3b-e789-4f43-ac70-bb570e59027e" (UID: "af5aeb3b-e789-4f43-ac70-bb570e59027e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.354556 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-config-data" (OuterVolumeSpecName: "config-data") pod "af5aeb3b-e789-4f43-ac70-bb570e59027e" (UID: "af5aeb3b-e789-4f43-ac70-bb570e59027e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.354978 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af5aeb3b-e789-4f43-ac70-bb570e59027e" (UID: "af5aeb3b-e789-4f43-ac70-bb570e59027e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.362447 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "af5aeb3b-e789-4f43-ac70-bb570e59027e" (UID: "af5aeb3b-e789-4f43-ac70-bb570e59027e"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.374117 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af5aeb3b-e789-4f43-ac70-bb570e59027e-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.374158 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.374202 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.374219 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.374233 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5aeb3b-e789-4f43-ac70-bb570e59027e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.374271 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt67d\" (UniqueName: \"kubernetes.io/projected/af5aeb3b-e789-4f43-ac70-bb570e59027e-kube-api-access-vt67d\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:04 crc kubenswrapper[4837]: I0111 17:52:04.374287 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af5aeb3b-e789-4f43-ac70-bb570e59027e-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:05 crc kubenswrapper[4837]: I0111 17:52:05.011216 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d4fd56848-nmkm6" event={"ID":"af5aeb3b-e789-4f43-ac70-bb570e59027e","Type":"ContainerDied","Data":"2cb13701105734300d330441ec671cbbb96ce16214153e592ff508e3acc0b60d"} Jan 11 17:52:05 crc kubenswrapper[4837]: I0111 17:52:05.011544 4837 scope.go:117] "RemoveContainer" containerID="31448bf7a925ddeb3fb3d1d51e19aa71d7fe4393dd1c0b9b7fa3db44c4fe276a" Jan 11 17:52:05 crc kubenswrapper[4837]: I0111 17:52:05.011241 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d4fd56848-nmkm6" Jan 11 17:52:05 crc kubenswrapper[4837]: I0111 17:52:05.037117 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d4fd56848-nmkm6"] Jan 11 17:52:05 crc kubenswrapper[4837]: I0111 17:52:05.047414 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d4fd56848-nmkm6"] Jan 11 17:52:05 crc kubenswrapper[4837]: I0111 17:52:05.221843 4837 scope.go:117] "RemoveContainer" containerID="6d7775187bc9d8933cd74c6fb7ea42d457f38072bf52d7ec4be27690a1b39be4" Jan 11 17:52:06 crc kubenswrapper[4837]: I0111 17:52:06.401201 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" path="/var/lib/kubelet/pods/af5aeb3b-e789-4f43-ac70-bb570e59027e/volumes" Jan 11 17:52:08 crc kubenswrapper[4837]: I0111 17:52:08.123547 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:52:08 crc kubenswrapper[4837]: I0111 17:52:08.124136 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-log" containerID="cri-o://f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244" gracePeriod=30 Jan 11 17:52:08 crc kubenswrapper[4837]: I0111 17:52:08.124233 4837 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-external-api-0" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-httpd" containerID="cri-o://b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613" gracePeriod=30 Jan 11 17:52:09 crc kubenswrapper[4837]: I0111 17:52:09.057046 4837 generic.go:334] "Generic (PLEG): container finished" podID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerID="f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244" exitCode=143 Jan 11 17:52:09 crc kubenswrapper[4837]: I0111 17:52:09.057156 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9669a553-3ac2-4189-86f2-08e5b972e66f","Type":"ContainerDied","Data":"f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244"} Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.782369 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.877749 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-log-httpd\") pod \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878135 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-config-data\") pod \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xld5s\" (UniqueName: \"kubernetes.io/projected/5c4f4894-6491-47db-8b74-f62a2aa8e39b-kube-api-access-xld5s\") pod \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\" (UID: 
\"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-sg-core-conf-yaml\") pod \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878316 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-run-httpd\") pod \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878378 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5c4f4894-6491-47db-8b74-f62a2aa8e39b" (UID: "5c4f4894-6491-47db-8b74-f62a2aa8e39b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878501 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-scripts\") pod \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878530 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-combined-ca-bundle\") pod \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\" (UID: \"5c4f4894-6491-47db-8b74-f62a2aa8e39b\") " Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878884 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.878886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5c4f4894-6491-47db-8b74-f62a2aa8e39b" (UID: "5c4f4894-6491-47db-8b74-f62a2aa8e39b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.884990 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4f4894-6491-47db-8b74-f62a2aa8e39b-kube-api-access-xld5s" (OuterVolumeSpecName: "kube-api-access-xld5s") pod "5c4f4894-6491-47db-8b74-f62a2aa8e39b" (UID: "5c4f4894-6491-47db-8b74-f62a2aa8e39b"). InnerVolumeSpecName "kube-api-access-xld5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.892541 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-scripts" (OuterVolumeSpecName: "scripts") pod "5c4f4894-6491-47db-8b74-f62a2aa8e39b" (UID: "5c4f4894-6491-47db-8b74-f62a2aa8e39b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.902948 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5c4f4894-6491-47db-8b74-f62a2aa8e39b" (UID: "5c4f4894-6491-47db-8b74-f62a2aa8e39b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.969885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c4f4894-6491-47db-8b74-f62a2aa8e39b" (UID: "5c4f4894-6491-47db-8b74-f62a2aa8e39b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.984939 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.984972 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.984987 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xld5s\" (UniqueName: \"kubernetes.io/projected/5c4f4894-6491-47db-8b74-f62a2aa8e39b-kube-api-access-xld5s\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.984999 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:10 crc kubenswrapper[4837]: I0111 17:52:10.985012 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c4f4894-6491-47db-8b74-f62a2aa8e39b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.005819 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-config-data" (OuterVolumeSpecName: "config-data") pod "5c4f4894-6491-47db-8b74-f62a2aa8e39b" (UID: "5c4f4894-6491-47db-8b74-f62a2aa8e39b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.086743 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4f4894-6491-47db-8b74-f62a2aa8e39b-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.102548 4837 generic.go:334] "Generic (PLEG): container finished" podID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerID="838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250" exitCode=0 Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.102594 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerDied","Data":"838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250"} Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.102619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c4f4894-6491-47db-8b74-f62a2aa8e39b","Type":"ContainerDied","Data":"75c579f9c4ffb6cee80c6437a73f56e8e1bdc42a396b2537ab4f77980d10a341"} Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.102635 4837 scope.go:117] "RemoveContainer" containerID="17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.102790 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.137174 4837 scope.go:117] "RemoveContainer" containerID="570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.159727 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.179005 4837 scope.go:117] "RemoveContainer" containerID="621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.182044 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.211994 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.212742 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="proxy-httpd" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.212783 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="proxy-httpd" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.212806 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="sg-core" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.212813 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="sg-core" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.212829 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="ceilometer-central-agent" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.212836 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" 
containerName="ceilometer-central-agent" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.212847 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="ceilometer-notification-agent" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.212853 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="ceilometer-notification-agent" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.212864 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon-log" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.212870 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon-log" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.212881 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.212888 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.213067 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon-log" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.213077 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5aeb3b-e789-4f43-ac70-bb570e59027e" containerName="horizon" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.213086 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="ceilometer-notification-agent" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.213093 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="sg-core" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.213100 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="ceilometer-central-agent" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.213115 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" containerName="proxy-httpd" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.215811 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.219861 4837 scope.go:117] "RemoveContainer" containerID="838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.221884 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.222814 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.241023 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.268916 4837 scope.go:117] "RemoveContainer" containerID="17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.269385 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac\": container with ID starting with 17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac not found: ID does not exist" containerID="17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 
17:52:11.269415 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac"} err="failed to get container status \"17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac\": rpc error: code = NotFound desc = could not find container \"17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac\": container with ID starting with 17b97700fa8db4e9c3121024fe22641b854fd92cdb9009903ea74f86f98666ac not found: ID does not exist" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.269437 4837 scope.go:117] "RemoveContainer" containerID="570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.269777 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887\": container with ID starting with 570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887 not found: ID does not exist" containerID="570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.269799 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887"} err="failed to get container status \"570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887\": rpc error: code = NotFound desc = could not find container \"570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887\": container with ID starting with 570266eb2340326c49259eac3b4565c9e614d7fd0f6630cef958494391b30887 not found: ID does not exist" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.269813 4837 scope.go:117] "RemoveContainer" containerID="621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e" Jan 11 17:52:11 crc 
kubenswrapper[4837]: E0111 17:52:11.270061 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e\": container with ID starting with 621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e not found: ID does not exist" containerID="621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.270079 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e"} err="failed to get container status \"621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e\": rpc error: code = NotFound desc = could not find container \"621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e\": container with ID starting with 621e2d3a0c40a000a43e685abe9f17daaacabb456ebe835ea1b2c9dbca65bf1e not found: ID does not exist" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.270090 4837 scope.go:117] "RemoveContainer" containerID="838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250" Jan 11 17:52:11 crc kubenswrapper[4837]: E0111 17:52:11.270342 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250\": container with ID starting with 838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250 not found: ID does not exist" containerID="838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.270360 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250"} err="failed to get container status 
\"838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250\": rpc error: code = NotFound desc = could not find container \"838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250\": container with ID starting with 838c87d73c9cbb7d52b43ccba724b178ed2caace81cc2bb2319851a9d40a6250 not found: ID does not exist" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.392543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-run-httpd\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.392671 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjn6\" (UniqueName: \"kubernetes.io/projected/23a5746a-8221-457c-8450-bdcded63f5c5-kube-api-access-ttjn6\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.392720 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.392750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-log-httpd\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.392793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-scripts\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.392876 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-config-data\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.392938 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.494942 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-run-httpd\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495066 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjn6\" (UniqueName: \"kubernetes.io/projected/23a5746a-8221-457c-8450-bdcded63f5c5-kube-api-access-ttjn6\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495104 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " 
pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495130 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-log-httpd\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495169 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-scripts\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495190 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-config-data\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495217 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495520 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-log-httpd\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.495918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-run-httpd\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.500304 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-config-data\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.500608 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.500898 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.502365 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-scripts\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.523016 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjn6\" (UniqueName: \"kubernetes.io/projected/23a5746a-8221-457c-8450-bdcded63f5c5-kube-api-access-ttjn6\") pod \"ceilometer-0\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.540344 4837 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.682178 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.682250 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.722830 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.735867 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.753425 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800236 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-logs\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800356 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhhfr\" (UniqueName: \"kubernetes.io/projected/9669a553-3ac2-4189-86f2-08e5b972e66f-kube-api-access-mhhfr\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: 
\"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800407 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-config-data\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800434 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-httpd-run\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800481 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-combined-ca-bundle\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800523 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-public-tls-certs\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.800586 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-scripts\") pod \"9669a553-3ac2-4189-86f2-08e5b972e66f\" (UID: \"9669a553-3ac2-4189-86f2-08e5b972e66f\") " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.806695 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: 
"glance") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.807598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-logs" (OuterVolumeSpecName: "logs") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.809969 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.811524 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9669a553-3ac2-4189-86f2-08e5b972e66f-kube-api-access-mhhfr" (OuterVolumeSpecName: "kube-api-access-mhhfr") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "kube-api-access-mhhfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.816174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-scripts" (OuterVolumeSpecName: "scripts") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.850341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.865950 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-config-data" (OuterVolumeSpecName: "config-data") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.884462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9669a553-3ac2-4189-86f2-08e5b972e66f" (UID: "9669a553-3ac2-4189-86f2-08e5b972e66f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.902962 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.903009 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.903020 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.903031 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhhfr\" (UniqueName: \"kubernetes.io/projected/9669a553-3ac2-4189-86f2-08e5b972e66f-kube-api-access-mhhfr\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.903040 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.903049 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9669a553-3ac2-4189-86f2-08e5b972e66f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.903056 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.903065 4837 reconciler_common.go:293] "Volume detached for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9669a553-3ac2-4189-86f2-08e5b972e66f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:11 crc kubenswrapper[4837]: I0111 17:52:11.922514 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.005138 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.036100 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.111163 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerStarted","Data":"7d369eff557df17bd4d4b5703e2b58ac763ca3f267409881c904de646090ac1b"} Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.112992 4837 generic.go:334] "Generic (PLEG): container finished" podID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerID="b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613" exitCode=0 Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.113061 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.113078 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9669a553-3ac2-4189-86f2-08e5b972e66f","Type":"ContainerDied","Data":"b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613"} Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.113427 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9669a553-3ac2-4189-86f2-08e5b972e66f","Type":"ContainerDied","Data":"6e9ae8e0d53b14efbab3700a8054fcc5237fed420f5b695b4b8af5293168a4f8"} Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.113454 4837 scope.go:117] "RemoveContainer" containerID="b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.116340 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.116376 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.138948 4837 scope.go:117] "RemoveContainer" containerID="f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.164515 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.165724 4837 scope.go:117] "RemoveContainer" containerID="b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613" Jan 11 17:52:12 crc kubenswrapper[4837]: E0111 17:52:12.169486 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613\": container with ID starting with b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613 not found: ID does not exist" containerID="b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.169527 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613"} err="failed to get container status \"b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613\": rpc error: code = NotFound desc = could not find container \"b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613\": container with ID starting with b03561769cf8682e22a6f07ad29243939f373c2e2bbbdd661ab942b564c98613 not found: ID does not exist" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.169560 4837 scope.go:117] "RemoveContainer" containerID="f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244" Jan 11 17:52:12 crc kubenswrapper[4837]: E0111 17:52:12.172817 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244\": container with ID starting with f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244 not found: ID does not exist" containerID="f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.172865 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244"} err="failed to get container status \"f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244\": rpc error: code = NotFound desc = could not find container \"f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244\": container with ID 
starting with f8b0b716163d47875c4c1afde9b512bd9dec9d5c171052ca7623704f62ed4244 not found: ID does not exist" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.175692 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.187825 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:52:12 crc kubenswrapper[4837]: E0111 17:52:12.188277 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-httpd" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.188302 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-httpd" Jan 11 17:52:12 crc kubenswrapper[4837]: E0111 17:52:12.188336 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-log" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.188346 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-log" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.188560 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-log" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.188587 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" containerName="glance-httpd" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.189749 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.192099 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.192375 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.199384 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.311888 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.311951 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.312083 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.312197 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.312450 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.312532 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3914d94-6947-4a7c-ac5e-45bfe15ae144-logs\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.312663 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt44g\" (UniqueName: \"kubernetes.io/projected/f3914d94-6947-4a7c-ac5e-45bfe15ae144-kube-api-access-gt44g\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.312708 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3914d94-6947-4a7c-ac5e-45bfe15ae144-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.374206 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5c4f4894-6491-47db-8b74-f62a2aa8e39b" path="/var/lib/kubelet/pods/5c4f4894-6491-47db-8b74-f62a2aa8e39b/volumes" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.374964 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9669a553-3ac2-4189-86f2-08e5b972e66f" path="/var/lib/kubelet/pods/9669a553-3ac2-4189-86f2-08e5b972e66f/volumes" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414257 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt44g\" (UniqueName: \"kubernetes.io/projected/f3914d94-6947-4a7c-ac5e-45bfe15ae144-kube-api-access-gt44g\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3914d94-6947-4a7c-ac5e-45bfe15ae144-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414404 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414443 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414489 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3914d94-6947-4a7c-ac5e-45bfe15ae144-logs\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414713 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.414960 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/f3914d94-6947-4a7c-ac5e-45bfe15ae144-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.415148 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3914d94-6947-4a7c-ac5e-45bfe15ae144-logs\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.422074 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.422431 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.422499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.440212 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3914d94-6947-4a7c-ac5e-45bfe15ae144-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.452124 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.454375 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt44g\" (UniqueName: \"kubernetes.io/projected/f3914d94-6947-4a7c-ac5e-45bfe15ae144-kube-api-access-gt44g\") pod \"glance-default-external-api-0\" (UID: \"f3914d94-6947-4a7c-ac5e-45bfe15ae144\") " pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.471569 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-h9f9z"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.472866 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.501587 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-h9f9z"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.509947 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.516085 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d9b221-84b0-4c01-9870-30500cafdaf5-operator-scripts\") pod \"nova-api-db-create-h9f9z\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.516215 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx5xr\" (UniqueName: \"kubernetes.io/projected/54d9b221-84b0-4c01-9870-30500cafdaf5-kube-api-access-kx5xr\") pod \"nova-api-db-create-h9f9z\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.588200 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hldbw"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.596097 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.620810 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx5xr\" (UniqueName: \"kubernetes.io/projected/54d9b221-84b0-4c01-9870-30500cafdaf5-kube-api-access-kx5xr\") pod \"nova-api-db-create-h9f9z\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.620980 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d9b221-84b0-4c01-9870-30500cafdaf5-operator-scripts\") pod \"nova-api-db-create-h9f9z\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.622805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d9b221-84b0-4c01-9870-30500cafdaf5-operator-scripts\") pod \"nova-api-db-create-h9f9z\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.631410 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7633-account-create-update-2l7l4"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.646093 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.652660 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.674694 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx5xr\" (UniqueName: \"kubernetes.io/projected/54d9b221-84b0-4c01-9870-30500cafdaf5-kube-api-access-kx5xr\") pod \"nova-api-db-create-h9f9z\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.678113 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hldbw"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.712054 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7633-account-create-update-2l7l4"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.725339 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzgz\" (UniqueName: \"kubernetes.io/projected/dc121be2-7012-40c1-8dbe-722d0e838685-kube-api-access-bzzgz\") pod \"nova-cell0-db-create-hldbw\" (UID: \"dc121be2-7012-40c1-8dbe-722d0e838685\") " pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.725620 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc121be2-7012-40c1-8dbe-722d0e838685-operator-scripts\") pod \"nova-cell0-db-create-hldbw\" (UID: \"dc121be2-7012-40c1-8dbe-722d0e838685\") " pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.725824 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/89212816-4e77-490e-ae62-d3aef36b0570-operator-scripts\") pod \"nova-api-7633-account-create-update-2l7l4\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.736919 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8k4d\" (UniqueName: \"kubernetes.io/projected/89212816-4e77-490e-ae62-d3aef36b0570-kube-api-access-q8k4d\") pod \"nova-api-7633-account-create-update-2l7l4\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.754052 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-v96ck"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.758898 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.787454 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-v96ck"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.793404 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f0ad-account-create-update-6nvvv"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.795962 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.803503 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.806896 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f0ad-account-create-update-6nvvv"] Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.839258 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85847325-510d-4024-86ce-271201c83e9a-operator-scripts\") pod \"nova-cell1-db-create-v96ck\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.839325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8k4d\" (UniqueName: \"kubernetes.io/projected/89212816-4e77-490e-ae62-d3aef36b0570-kube-api-access-q8k4d\") pod \"nova-api-7633-account-create-update-2l7l4\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.839377 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9pr9\" (UniqueName: \"kubernetes.io/projected/85847325-510d-4024-86ce-271201c83e9a-kube-api-access-f9pr9\") pod \"nova-cell1-db-create-v96ck\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.839395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzgz\" (UniqueName: \"kubernetes.io/projected/dc121be2-7012-40c1-8dbe-722d0e838685-kube-api-access-bzzgz\") pod \"nova-cell0-db-create-hldbw\" (UID: 
\"dc121be2-7012-40c1-8dbe-722d0e838685\") " pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.839429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc121be2-7012-40c1-8dbe-722d0e838685-operator-scripts\") pod \"nova-cell0-db-create-hldbw\" (UID: \"dc121be2-7012-40c1-8dbe-722d0e838685\") " pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.839477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89212816-4e77-490e-ae62-d3aef36b0570-operator-scripts\") pod \"nova-api-7633-account-create-update-2l7l4\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.840113 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89212816-4e77-490e-ae62-d3aef36b0570-operator-scripts\") pod \"nova-api-7633-account-create-update-2l7l4\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.840561 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc121be2-7012-40c1-8dbe-722d0e838685-operator-scripts\") pod \"nova-cell0-db-create-hldbw\" (UID: \"dc121be2-7012-40c1-8dbe-722d0e838685\") " pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.870768 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.871330 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8k4d\" (UniqueName: \"kubernetes.io/projected/89212816-4e77-490e-ae62-d3aef36b0570-kube-api-access-q8k4d\") pod \"nova-api-7633-account-create-update-2l7l4\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.874107 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzgz\" (UniqueName: \"kubernetes.io/projected/dc121be2-7012-40c1-8dbe-722d0e838685-kube-api-access-bzzgz\") pod \"nova-cell0-db-create-hldbw\" (UID: \"dc121be2-7012-40c1-8dbe-722d0e838685\") " pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.942515 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f55033-05b1-49a2-8848-17633aeea8ca-operator-scripts\") pod \"nova-cell0-f0ad-account-create-update-6nvvv\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.943080 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqng\" (UniqueName: \"kubernetes.io/projected/b1f55033-05b1-49a2-8848-17633aeea8ca-kube-api-access-cvqng\") pod \"nova-cell0-f0ad-account-create-update-6nvvv\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.943146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/85847325-510d-4024-86ce-271201c83e9a-operator-scripts\") pod \"nova-cell1-db-create-v96ck\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.943280 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9pr9\" (UniqueName: \"kubernetes.io/projected/85847325-510d-4024-86ce-271201c83e9a-kube-api-access-f9pr9\") pod \"nova-cell1-db-create-v96ck\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.944567 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85847325-510d-4024-86ce-271201c83e9a-operator-scripts\") pod \"nova-cell1-db-create-v96ck\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.957706 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.985529 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9pr9\" (UniqueName: \"kubernetes.io/projected/85847325-510d-4024-86ce-271201c83e9a-kube-api-access-f9pr9\") pod \"nova-cell1-db-create-v96ck\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:12 crc kubenswrapper[4837]: I0111 17:52:12.995088 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.009513 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e41d-account-create-update-l96tm"] Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.010905 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.017827 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.044917 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f55033-05b1-49a2-8848-17633aeea8ca-operator-scripts\") pod \"nova-cell0-f0ad-account-create-update-6nvvv\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.045006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvqng\" (UniqueName: \"kubernetes.io/projected/b1f55033-05b1-49a2-8848-17633aeea8ca-kube-api-access-cvqng\") pod \"nova-cell0-f0ad-account-create-update-6nvvv\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.046072 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f55033-05b1-49a2-8848-17633aeea8ca-operator-scripts\") pod \"nova-cell0-f0ad-account-create-update-6nvvv\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.050217 4837 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-e41d-account-create-update-l96tm"] Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.072883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqng\" (UniqueName: \"kubernetes.io/projected/b1f55033-05b1-49a2-8848-17633aeea8ca-kube-api-access-cvqng\") pod \"nova-cell0-f0ad-account-create-update-6nvvv\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.088640 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.124947 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.147035 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a509deff-a6eb-435a-8739-bc5b0489d32b-operator-scripts\") pod \"nova-cell1-e41d-account-create-update-l96tm\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.147283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlpgt\" (UniqueName: \"kubernetes.io/projected/a509deff-a6eb-435a-8739-bc5b0489d32b-kube-api-access-qlpgt\") pod \"nova-cell1-e41d-account-create-update-l96tm\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.151251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerStarted","Data":"eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937"} Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.249884 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a509deff-a6eb-435a-8739-bc5b0489d32b-operator-scripts\") pod \"nova-cell1-e41d-account-create-update-l96tm\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.250269 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlpgt\" (UniqueName: \"kubernetes.io/projected/a509deff-a6eb-435a-8739-bc5b0489d32b-kube-api-access-qlpgt\") pod \"nova-cell1-e41d-account-create-update-l96tm\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.252664 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a509deff-a6eb-435a-8739-bc5b0489d32b-operator-scripts\") pod \"nova-cell1-e41d-account-create-update-l96tm\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.268520 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlpgt\" (UniqueName: \"kubernetes.io/projected/a509deff-a6eb-435a-8739-bc5b0489d32b-kube-api-access-qlpgt\") pod \"nova-cell1-e41d-account-create-update-l96tm\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.271212 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Jan 11 17:52:13 crc kubenswrapper[4837]: W0111 17:52:13.282954 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3914d94_6947_4a7c_ac5e_45bfe15ae144.slice/crio-a947157518a85bbc94baa93884dd0e9ae65a84bea59f9df8e8032f834a24c083 WatchSource:0}: Error finding container a947157518a85bbc94baa93884dd0e9ae65a84bea59f9df8e8032f834a24c083: Status 404 returned error can't find the container with id a947157518a85bbc94baa93884dd0e9ae65a84bea59f9df8e8032f834a24c083 Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.518553 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-h9f9z"] Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.569074 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.802007 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f0ad-account-create-update-6nvvv"] Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.819052 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hldbw"] Jan 11 17:52:13 crc kubenswrapper[4837]: I0111 17:52:13.915860 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-v96ck"] Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.036820 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7633-account-create-update-2l7l4"] Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.118419 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e41d-account-create-update-l96tm"] Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.181787 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerStarted","Data":"8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.188042 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hldbw" event={"ID":"dc121be2-7012-40c1-8dbe-722d0e838685","Type":"ContainerStarted","Data":"3d6e50fd85c568c85366cf4cddf3e7061e6d06f29582ab430827919640f27f10"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.189511 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v96ck" event={"ID":"85847325-510d-4024-86ce-271201c83e9a","Type":"ContainerStarted","Data":"7ffa2b3a67467fc980f4deffb09e70afd7f0d89796d54fe8177e7d510c6d3cbf"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.194839 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h9f9z" event={"ID":"54d9b221-84b0-4c01-9870-30500cafdaf5","Type":"ContainerStarted","Data":"2484e2f3d4f9e02a4e3e8f7c00cb627e0fa0c74308de8ec37513e26776d877a3"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.194876 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h9f9z" event={"ID":"54d9b221-84b0-4c01-9870-30500cafdaf5","Type":"ContainerStarted","Data":"16a0328e026302d2b3bf0c078264fdd4cc6a5282a3f834375dab25a5bf745f2f"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.206162 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3914d94-6947-4a7c-ac5e-45bfe15ae144","Type":"ContainerStarted","Data":"a947157518a85bbc94baa93884dd0e9ae65a84bea59f9df8e8032f834a24c083"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.214578 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" 
event={"ID":"b1f55033-05b1-49a2-8848-17633aeea8ca","Type":"ContainerStarted","Data":"d2a7a421c5bd5a18811511c6b53c97e10353ddb84fb0326da05f7ff8ea7ed982"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.216021 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.216041 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.216784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7633-account-create-update-2l7l4" event={"ID":"89212816-4e77-490e-ae62-d3aef36b0570","Type":"ContainerStarted","Data":"7a21c6a58ad9e55e3850645da0aa199261c2f2f46b2d8a987b4586079d1e91a4"} Jan 11 17:52:14 crc kubenswrapper[4837]: I0111 17:52:14.780329 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.231343 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3914d94-6947-4a7c-ac5e-45bfe15ae144","Type":"ContainerStarted","Data":"3cd7f06f2417c2880e162e9d3fa61e619a118d38d6fba594179c354ce2765981"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.232838 4837 generic.go:334] "Generic (PLEG): container finished" podID="b1f55033-05b1-49a2-8848-17633aeea8ca" containerID="6baa8ef7bb546d3bdf87c4bba378b15f00d28eefc5d10b4bb95bf7755e989703" exitCode=0 Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.232892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" event={"ID":"b1f55033-05b1-49a2-8848-17633aeea8ca","Type":"ContainerDied","Data":"6baa8ef7bb546d3bdf87c4bba378b15f00d28eefc5d10b4bb95bf7755e989703"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.234968 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-7633-account-create-update-2l7l4" event={"ID":"89212816-4e77-490e-ae62-d3aef36b0570","Type":"ContainerStarted","Data":"8cab87f746855c422ef01a1606595bfd84e949e4b33bdb593d09d8efdaf27647"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.255713 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerStarted","Data":"f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.257056 4837 generic.go:334] "Generic (PLEG): container finished" podID="85847325-510d-4024-86ce-271201c83e9a" containerID="7fa53c8e94596391adea74ec48e1196c2e3b48844f54bb196268fa54002d16c6" exitCode=0 Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.257107 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v96ck" event={"ID":"85847325-510d-4024-86ce-271201c83e9a","Type":"ContainerDied","Data":"7fa53c8e94596391adea74ec48e1196c2e3b48844f54bb196268fa54002d16c6"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.258442 4837 generic.go:334] "Generic (PLEG): container finished" podID="dc121be2-7012-40c1-8dbe-722d0e838685" containerID="bd8f463131730357cfa62a1f3faa0774e2d9f9588be3e824c3284c77ce387198" exitCode=0 Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.258497 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hldbw" event={"ID":"dc121be2-7012-40c1-8dbe-722d0e838685","Type":"ContainerDied","Data":"bd8f463131730357cfa62a1f3faa0774e2d9f9588be3e824c3284c77ce387198"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.259524 4837 generic.go:334] "Generic (PLEG): container finished" podID="54d9b221-84b0-4c01-9870-30500cafdaf5" containerID="2484e2f3d4f9e02a4e3e8f7c00cb627e0fa0c74308de8ec37513e26776d877a3" exitCode=0 Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.259569 4837 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-db-create-h9f9z" event={"ID":"54d9b221-84b0-4c01-9870-30500cafdaf5","Type":"ContainerDied","Data":"2484e2f3d4f9e02a4e3e8f7c00cb627e0fa0c74308de8ec37513e26776d877a3"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.275189 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-7633-account-create-update-2l7l4" podStartSLOduration=3.275168832 podStartE2EDuration="3.275168832s" podCreationTimestamp="2026-01-11 17:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:52:15.272052948 +0000 UTC m=+1309.450245654" watchObservedRunningTime="2026-01-11 17:52:15.275168832 +0000 UTC m=+1309.453361538" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.302471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e41d-account-create-update-l96tm" event={"ID":"a509deff-a6eb-435a-8739-bc5b0489d32b","Type":"ContainerDied","Data":"437e503187a681729589f86207c5c5c71561dae2047ed8281de6b7e9f0d1498f"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.302323 4837 generic.go:334] "Generic (PLEG): container finished" podID="a509deff-a6eb-435a-8739-bc5b0489d32b" containerID="437e503187a681729589f86207c5c5c71561dae2047ed8281de6b7e9f0d1498f" exitCode=0 Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.302714 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.303081 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e41d-account-create-update-l96tm" event={"ID":"a509deff-a6eb-435a-8739-bc5b0489d32b","Type":"ContainerStarted","Data":"186a9355fa5d0b65158dce79d42c58f76538bd581932d1d26320bde865f01090"} Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.342465 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 
17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.518758 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.658976 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.735896 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d9b221-84b0-4c01-9870-30500cafdaf5-operator-scripts\") pod \"54d9b221-84b0-4c01-9870-30500cafdaf5\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.736345 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx5xr\" (UniqueName: \"kubernetes.io/projected/54d9b221-84b0-4c01-9870-30500cafdaf5-kube-api-access-kx5xr\") pod \"54d9b221-84b0-4c01-9870-30500cafdaf5\" (UID: \"54d9b221-84b0-4c01-9870-30500cafdaf5\") " Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.737805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d9b221-84b0-4c01-9870-30500cafdaf5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54d9b221-84b0-4c01-9870-30500cafdaf5" (UID: "54d9b221-84b0-4c01-9870-30500cafdaf5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.742452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d9b221-84b0-4c01-9870-30500cafdaf5-kube-api-access-kx5xr" (OuterVolumeSpecName: "kube-api-access-kx5xr") pod "54d9b221-84b0-4c01-9870-30500cafdaf5" (UID: "54d9b221-84b0-4c01-9870-30500cafdaf5"). InnerVolumeSpecName "kube-api-access-kx5xr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.838327 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx5xr\" (UniqueName: \"kubernetes.io/projected/54d9b221-84b0-4c01-9870-30500cafdaf5-kube-api-access-kx5xr\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:15 crc kubenswrapper[4837]: I0111 17:52:15.838368 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d9b221-84b0-4c01-9870-30500cafdaf5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.313787 4837 generic.go:334] "Generic (PLEG): container finished" podID="89212816-4e77-490e-ae62-d3aef36b0570" containerID="8cab87f746855c422ef01a1606595bfd84e949e4b33bdb593d09d8efdaf27647" exitCode=0 Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.313882 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7633-account-create-update-2l7l4" event={"ID":"89212816-4e77-490e-ae62-d3aef36b0570","Type":"ContainerDied","Data":"8cab87f746855c422ef01a1606595bfd84e949e4b33bdb593d09d8efdaf27647"} Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.316070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h9f9z" event={"ID":"54d9b221-84b0-4c01-9870-30500cafdaf5","Type":"ContainerDied","Data":"16a0328e026302d2b3bf0c078264fdd4cc6a5282a3f834375dab25a5bf745f2f"} Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.316120 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16a0328e026302d2b3bf0c078264fdd4cc6a5282a3f834375dab25a5bf745f2f" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.316197 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-h9f9z" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.319580 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3914d94-6947-4a7c-ac5e-45bfe15ae144","Type":"ContainerStarted","Data":"e78914ab70cb7543e30ffe564fcf92888eeb87af01441babf3b339428c0e3da1"} Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.353580 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.353564436 podStartE2EDuration="4.353564436s" podCreationTimestamp="2026-01-11 17:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:52:16.350899435 +0000 UTC m=+1310.529092161" watchObservedRunningTime="2026-01-11 17:52:16.353564436 +0000 UTC m=+1310.531757142" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.680786 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.844838 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.852189 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.854793 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.867452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzzgz\" (UniqueName: \"kubernetes.io/projected/dc121be2-7012-40c1-8dbe-722d0e838685-kube-api-access-bzzgz\") pod \"dc121be2-7012-40c1-8dbe-722d0e838685\" (UID: \"dc121be2-7012-40c1-8dbe-722d0e838685\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.867500 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc121be2-7012-40c1-8dbe-722d0e838685-operator-scripts\") pod \"dc121be2-7012-40c1-8dbe-722d0e838685\" (UID: \"dc121be2-7012-40c1-8dbe-722d0e838685\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.868559 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc121be2-7012-40c1-8dbe-722d0e838685-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc121be2-7012-40c1-8dbe-722d0e838685" (UID: "dc121be2-7012-40c1-8dbe-722d0e838685"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.875532 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc121be2-7012-40c1-8dbe-722d0e838685-kube-api-access-bzzgz" (OuterVolumeSpecName: "kube-api-access-bzzgz") pod "dc121be2-7012-40c1-8dbe-722d0e838685" (UID: "dc121be2-7012-40c1-8dbe-722d0e838685"). InnerVolumeSpecName "kube-api-access-bzzgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.968730 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlpgt\" (UniqueName: \"kubernetes.io/projected/a509deff-a6eb-435a-8739-bc5b0489d32b-kube-api-access-qlpgt\") pod \"a509deff-a6eb-435a-8739-bc5b0489d32b\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.968776 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f55033-05b1-49a2-8848-17633aeea8ca-operator-scripts\") pod \"b1f55033-05b1-49a2-8848-17633aeea8ca\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.968846 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a509deff-a6eb-435a-8739-bc5b0489d32b-operator-scripts\") pod \"a509deff-a6eb-435a-8739-bc5b0489d32b\" (UID: \"a509deff-a6eb-435a-8739-bc5b0489d32b\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.968941 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9pr9\" (UniqueName: \"kubernetes.io/projected/85847325-510d-4024-86ce-271201c83e9a-kube-api-access-f9pr9\") pod \"85847325-510d-4024-86ce-271201c83e9a\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.969026 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvqng\" (UniqueName: \"kubernetes.io/projected/b1f55033-05b1-49a2-8848-17633aeea8ca-kube-api-access-cvqng\") pod \"b1f55033-05b1-49a2-8848-17633aeea8ca\" (UID: \"b1f55033-05b1-49a2-8848-17633aeea8ca\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.969045 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85847325-510d-4024-86ce-271201c83e9a-operator-scripts\") pod \"85847325-510d-4024-86ce-271201c83e9a\" (UID: \"85847325-510d-4024-86ce-271201c83e9a\") " Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.969393 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzzgz\" (UniqueName: \"kubernetes.io/projected/dc121be2-7012-40c1-8dbe-722d0e838685-kube-api-access-bzzgz\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.969412 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc121be2-7012-40c1-8dbe-722d0e838685-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.969512 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f55033-05b1-49a2-8848-17633aeea8ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1f55033-05b1-49a2-8848-17633aeea8ca" (UID: "b1f55033-05b1-49a2-8848-17633aeea8ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.969815 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85847325-510d-4024-86ce-271201c83e9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85847325-510d-4024-86ce-271201c83e9a" (UID: "85847325-510d-4024-86ce-271201c83e9a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.970163 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a509deff-a6eb-435a-8739-bc5b0489d32b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a509deff-a6eb-435a-8739-bc5b0489d32b" (UID: "a509deff-a6eb-435a-8739-bc5b0489d32b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.975894 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a509deff-a6eb-435a-8739-bc5b0489d32b-kube-api-access-qlpgt" (OuterVolumeSpecName: "kube-api-access-qlpgt") pod "a509deff-a6eb-435a-8739-bc5b0489d32b" (UID: "a509deff-a6eb-435a-8739-bc5b0489d32b"). InnerVolumeSpecName "kube-api-access-qlpgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.976937 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f55033-05b1-49a2-8848-17633aeea8ca-kube-api-access-cvqng" (OuterVolumeSpecName: "kube-api-access-cvqng") pod "b1f55033-05b1-49a2-8848-17633aeea8ca" (UID: "b1f55033-05b1-49a2-8848-17633aeea8ca"). InnerVolumeSpecName "kube-api-access-cvqng". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:16 crc kubenswrapper[4837]: I0111 17:52:16.977032 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85847325-510d-4024-86ce-271201c83e9a-kube-api-access-f9pr9" (OuterVolumeSpecName: "kube-api-access-f9pr9") pod "85847325-510d-4024-86ce-271201c83e9a" (UID: "85847325-510d-4024-86ce-271201c83e9a"). InnerVolumeSpecName "kube-api-access-f9pr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.072303 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlpgt\" (UniqueName: \"kubernetes.io/projected/a509deff-a6eb-435a-8739-bc5b0489d32b-kube-api-access-qlpgt\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.072329 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f55033-05b1-49a2-8848-17633aeea8ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.072339 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a509deff-a6eb-435a-8739-bc5b0489d32b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.072348 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9pr9\" (UniqueName: \"kubernetes.io/projected/85847325-510d-4024-86ce-271201c83e9a-kube-api-access-f9pr9\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.072357 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvqng\" (UniqueName: \"kubernetes.io/projected/b1f55033-05b1-49a2-8848-17633aeea8ca-kube-api-access-cvqng\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.072366 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85847325-510d-4024-86ce-271201c83e9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.329444 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e41d-account-create-update-l96tm" 
event={"ID":"a509deff-a6eb-435a-8739-bc5b0489d32b","Type":"ContainerDied","Data":"186a9355fa5d0b65158dce79d42c58f76538bd581932d1d26320bde865f01090"} Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.329467 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e41d-account-create-update-l96tm" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.329605 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="186a9355fa5d0b65158dce79d42c58f76538bd581932d1d26320bde865f01090" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.331325 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" event={"ID":"b1f55033-05b1-49a2-8848-17633aeea8ca","Type":"ContainerDied","Data":"d2a7a421c5bd5a18811511c6b53c97e10353ddb84fb0326da05f7ff8ea7ed982"} Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.331372 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2a7a421c5bd5a18811511c6b53c97e10353ddb84fb0326da05f7ff8ea7ed982" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.331850 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f0ad-account-create-update-6nvvv" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.333942 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerStarted","Data":"b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a"} Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.334081 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-central-agent" containerID="cri-o://eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937" gracePeriod=30 Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.334152 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.334417 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="proxy-httpd" containerID="cri-o://b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a" gracePeriod=30 Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.334462 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="sg-core" containerID="cri-o://f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061" gracePeriod=30 Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.334493 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-notification-agent" containerID="cri-o://8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26" gracePeriod=30 Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 
17:52:17.341855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v96ck" event={"ID":"85847325-510d-4024-86ce-271201c83e9a","Type":"ContainerDied","Data":"7ffa2b3a67467fc980f4deffb09e70afd7f0d89796d54fe8177e7d510c6d3cbf"} Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.341901 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ffa2b3a67467fc980f4deffb09e70afd7f0d89796d54fe8177e7d510c6d3cbf" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.341951 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v96ck" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.347722 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hldbw" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.348814 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hldbw" event={"ID":"dc121be2-7012-40c1-8dbe-722d0e838685","Type":"ContainerDied","Data":"3d6e50fd85c568c85366cf4cddf3e7061e6d06f29582ab430827919640f27f10"} Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.348852 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d6e50fd85c568c85366cf4cddf3e7061e6d06f29582ab430827919640f27f10" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.358407 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.38561697 podStartE2EDuration="6.358388634s" podCreationTimestamp="2026-01-11 17:52:11 +0000 UTC" firstStartedPulling="2026-01-11 17:52:12.034062239 +0000 UTC m=+1306.212254945" lastFinishedPulling="2026-01-11 17:52:16.006833903 +0000 UTC m=+1310.185026609" observedRunningTime="2026-01-11 17:52:17.356164945 +0000 UTC m=+1311.534357641" watchObservedRunningTime="2026-01-11 17:52:17.358388634 +0000 UTC m=+1311.536581340" Jan 
11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.724227 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.889603 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8k4d\" (UniqueName: \"kubernetes.io/projected/89212816-4e77-490e-ae62-d3aef36b0570-kube-api-access-q8k4d\") pod \"89212816-4e77-490e-ae62-d3aef36b0570\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.889686 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89212816-4e77-490e-ae62-d3aef36b0570-operator-scripts\") pod \"89212816-4e77-490e-ae62-d3aef36b0570\" (UID: \"89212816-4e77-490e-ae62-d3aef36b0570\") " Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.890218 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89212816-4e77-490e-ae62-d3aef36b0570-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89212816-4e77-490e-ae62-d3aef36b0570" (UID: "89212816-4e77-490e-ae62-d3aef36b0570"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.897877 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89212816-4e77-490e-ae62-d3aef36b0570-kube-api-access-q8k4d" (OuterVolumeSpecName: "kube-api-access-q8k4d") pod "89212816-4e77-490e-ae62-d3aef36b0570" (UID: "89212816-4e77-490e-ae62-d3aef36b0570"). InnerVolumeSpecName "kube-api-access-q8k4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.991295 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8k4d\" (UniqueName: \"kubernetes.io/projected/89212816-4e77-490e-ae62-d3aef36b0570-kube-api-access-q8k4d\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:17 crc kubenswrapper[4837]: I0111 17:52:17.991331 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89212816-4e77-490e-ae62-d3aef36b0570-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.358042 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7633-account-create-update-2l7l4" event={"ID":"89212816-4e77-490e-ae62-d3aef36b0570","Type":"ContainerDied","Data":"7a21c6a58ad9e55e3850645da0aa199261c2f2f46b2d8a987b4586079d1e91a4"} Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.358319 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a21c6a58ad9e55e3850645da0aa199261c2f2f46b2d8a987b4586079d1e91a4" Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.358392 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7633-account-create-update-2l7l4" Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.360808 4837 generic.go:334] "Generic (PLEG): container finished" podID="23a5746a-8221-457c-8450-bdcded63f5c5" containerID="b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a" exitCode=0 Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.360835 4837 generic.go:334] "Generic (PLEG): container finished" podID="23a5746a-8221-457c-8450-bdcded63f5c5" containerID="f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061" exitCode=2 Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.360843 4837 generic.go:334] "Generic (PLEG): container finished" podID="23a5746a-8221-457c-8450-bdcded63f5c5" containerID="8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26" exitCode=0 Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.360846 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerDied","Data":"b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a"} Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.360893 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerDied","Data":"f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061"} Jan 11 17:52:18 crc kubenswrapper[4837]: I0111 17:52:18.360907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerDied","Data":"8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26"} Jan 11 17:52:22 crc kubenswrapper[4837]: I0111 17:52:22.511369 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 11 17:52:22 crc kubenswrapper[4837]: I0111 17:52:22.511429 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 11 17:52:22 crc kubenswrapper[4837]: I0111 17:52:22.541287 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 11 17:52:22 crc kubenswrapper[4837]: I0111 17:52:22.550399 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.055564 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q5qqm"] Jan 11 17:52:23 crc kubenswrapper[4837]: E0111 17:52:23.056016 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d9b221-84b0-4c01-9870-30500cafdaf5" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056033 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d9b221-84b0-4c01-9870-30500cafdaf5" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: E0111 17:52:23.056044 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f55033-05b1-49a2-8848-17633aeea8ca" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056051 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f55033-05b1-49a2-8848-17633aeea8ca" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: E0111 17:52:23.056073 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85847325-510d-4024-86ce-271201c83e9a" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056079 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="85847325-510d-4024-86ce-271201c83e9a" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: E0111 17:52:23.056093 4837 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="89212816-4e77-490e-ae62-d3aef36b0570" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056100 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="89212816-4e77-490e-ae62-d3aef36b0570" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: E0111 17:52:23.056112 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc121be2-7012-40c1-8dbe-722d0e838685" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056118 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc121be2-7012-40c1-8dbe-722d0e838685" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: E0111 17:52:23.056137 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a509deff-a6eb-435a-8739-bc5b0489d32b" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056143 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a509deff-a6eb-435a-8739-bc5b0489d32b" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056313 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a509deff-a6eb-435a-8739-bc5b0489d32b" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056330 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc121be2-7012-40c1-8dbe-722d0e838685" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056342 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d9b221-84b0-4c01-9870-30500cafdaf5" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056351 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f55033-05b1-49a2-8848-17633aeea8ca" 
containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056358 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="89212816-4e77-490e-ae62-d3aef36b0570" containerName="mariadb-account-create-update" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056368 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="85847325-510d-4024-86ce-271201c83e9a" containerName="mariadb-database-create" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.056955 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.065114 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.065470 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.065569 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2gtdx" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.080714 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q5qqm"] Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.197598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-config-data\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.197660 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsn99\" (UniqueName: 
\"kubernetes.io/projected/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-kube-api-access-zsn99\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.197698 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.197780 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-scripts\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.299912 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-config-data\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.300111 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsn99\" (UniqueName: \"kubernetes.io/projected/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-kube-api-access-zsn99\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.300196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.300435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-scripts\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.306777 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-scripts\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.308501 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-config-data\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.317413 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.339250 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsn99\" (UniqueName: 
\"kubernetes.io/projected/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-kube-api-access-zsn99\") pod \"nova-cell0-conductor-db-sync-q5qqm\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.389212 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.412728 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.413027 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 11 17:52:23 crc kubenswrapper[4837]: I0111 17:52:23.910226 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q5qqm"] Jan 11 17:52:24 crc kubenswrapper[4837]: I0111 17:52:24.434574 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" event={"ID":"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f","Type":"ContainerStarted","Data":"a5388a8b5937eb843b4ead121b654bba453556dea7529f6c2d7295942b160ad5"} Jan 11 17:52:25 crc kubenswrapper[4837]: I0111 17:52:25.228370 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 11 17:52:25 crc kubenswrapper[4837]: I0111 17:52:25.292940 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.368681 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.475917 4837 generic.go:334] "Generic (PLEG): container finished" podID="23a5746a-8221-457c-8450-bdcded63f5c5" containerID="eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937" exitCode=0 Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.475960 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerDied","Data":"eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937"} Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.476027 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.476125 4837 scope.go:117] "RemoveContainer" containerID="b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.476084 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23a5746a-8221-457c-8450-bdcded63f5c5","Type":"ContainerDied","Data":"7d369eff557df17bd4d4b5703e2b58ac763ca3f267409881c904de646090ac1b"} Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.531218 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-sg-core-conf-yaml\") pod \"23a5746a-8221-457c-8450-bdcded63f5c5\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.531290 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-log-httpd\") pod \"23a5746a-8221-457c-8450-bdcded63f5c5\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " Jan 11 17:52:27 crc kubenswrapper[4837]: 
I0111 17:52:27.531314 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-scripts\") pod \"23a5746a-8221-457c-8450-bdcded63f5c5\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.531371 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-run-httpd\") pod \"23a5746a-8221-457c-8450-bdcded63f5c5\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.531401 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttjn6\" (UniqueName: \"kubernetes.io/projected/23a5746a-8221-457c-8450-bdcded63f5c5-kube-api-access-ttjn6\") pod \"23a5746a-8221-457c-8450-bdcded63f5c5\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.531473 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-combined-ca-bundle\") pod \"23a5746a-8221-457c-8450-bdcded63f5c5\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.531684 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-config-data\") pod \"23a5746a-8221-457c-8450-bdcded63f5c5\" (UID: \"23a5746a-8221-457c-8450-bdcded63f5c5\") " Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.532751 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"23a5746a-8221-457c-8450-bdcded63f5c5" (UID: "23a5746a-8221-457c-8450-bdcded63f5c5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.533401 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "23a5746a-8221-457c-8450-bdcded63f5c5" (UID: "23a5746a-8221-457c-8450-bdcded63f5c5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.537432 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a5746a-8221-457c-8450-bdcded63f5c5-kube-api-access-ttjn6" (OuterVolumeSpecName: "kube-api-access-ttjn6") pod "23a5746a-8221-457c-8450-bdcded63f5c5" (UID: "23a5746a-8221-457c-8450-bdcded63f5c5"). InnerVolumeSpecName "kube-api-access-ttjn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.541788 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-scripts" (OuterVolumeSpecName: "scripts") pod "23a5746a-8221-457c-8450-bdcded63f5c5" (UID: "23a5746a-8221-457c-8450-bdcded63f5c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.558149 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "23a5746a-8221-457c-8450-bdcded63f5c5" (UID: "23a5746a-8221-457c-8450-bdcded63f5c5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.623167 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-config-data" (OuterVolumeSpecName: "config-data") pod "23a5746a-8221-457c-8450-bdcded63f5c5" (UID: "23a5746a-8221-457c-8450-bdcded63f5c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.633530 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.633558 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.633572 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.633586 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23a5746a-8221-457c-8450-bdcded63f5c5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.633598 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttjn6\" (UniqueName: \"kubernetes.io/projected/23a5746a-8221-457c-8450-bdcded63f5c5-kube-api-access-ttjn6\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.633611 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.637552 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23a5746a-8221-457c-8450-bdcded63f5c5" (UID: "23a5746a-8221-457c-8450-bdcded63f5c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.735063 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a5746a-8221-457c-8450-bdcded63f5c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.841851 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.860973 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.871825 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:27 crc kubenswrapper[4837]: E0111 17:52:27.872434 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="proxy-httpd" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872457 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="proxy-httpd" Jan 11 17:52:27 crc kubenswrapper[4837]: E0111 17:52:27.872480 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="sg-core" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872488 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="sg-core" Jan 11 17:52:27 crc kubenswrapper[4837]: E0111 17:52:27.872505 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-notification-agent" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872514 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-notification-agent" Jan 11 17:52:27 crc kubenswrapper[4837]: E0111 17:52:27.872547 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-central-agent" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872556 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-central-agent" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872790 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-central-agent" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872822 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="proxy-httpd" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872842 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="sg-core" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.872858 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" containerName="ceilometer-notification-agent" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.874919 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.877485 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.878243 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 11 17:52:27 crc kubenswrapper[4837]: I0111 17:52:27.895924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.041961 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.042066 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-log-httpd\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.042198 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-scripts\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.042241 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr6lf\" (UniqueName: \"kubernetes.io/projected/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-kube-api-access-qr6lf\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " 
pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.042294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-run-httpd\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.042331 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-config-data\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.042358 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.143228 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-scripts\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.143279 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr6lf\" (UniqueName: \"kubernetes.io/projected/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-kube-api-access-qr6lf\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.143314 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-run-httpd\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.143338 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-config-data\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.143356 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.143394 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.143438 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-log-httpd\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.144145 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-run-httpd\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: 
I0111 17:52:28.146582 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-log-httpd\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.148648 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.149335 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-scripts\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.152710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.160736 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-config-data\") pod \"ceilometer-0\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.161880 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr6lf\" (UniqueName: \"kubernetes.io/projected/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-kube-api-access-qr6lf\") pod \"ceilometer-0\" (UID: 
\"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.199026 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:28 crc kubenswrapper[4837]: I0111 17:52:28.375924 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a5746a-8221-457c-8450-bdcded63f5c5" path="/var/lib/kubelet/pods/23a5746a-8221-457c-8450-bdcded63f5c5/volumes" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.139379 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.266917 4837 scope.go:117] "RemoveContainer" containerID="f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.319357 4837 scope.go:117] "RemoveContainer" containerID="8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.478112 4837 scope.go:117] "RemoveContainer" containerID="eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.514992 4837 scope.go:117] "RemoveContainer" containerID="b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a" Jan 11 17:52:32 crc kubenswrapper[4837]: E0111 17:52:32.515755 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a\": container with ID starting with b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a not found: ID does not exist" containerID="b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.515798 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a"} err="failed to get container status \"b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a\": rpc error: code = NotFound desc = could not find container \"b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a\": container with ID starting with b376cf78a0fba129151fd7eb78254fa620eb6ba37cd5c63f49819a27f161126a not found: ID does not exist" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.515829 4837 scope.go:117] "RemoveContainer" containerID="f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061" Jan 11 17:52:32 crc kubenswrapper[4837]: E0111 17:52:32.516295 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061\": container with ID starting with f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061 not found: ID does not exist" containerID="f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.516326 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061"} err="failed to get container status \"f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061\": rpc error: code = NotFound desc = could not find container \"f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061\": container with ID starting with f3fa7b57408e597bcf73eaad13d435014b6b8a4c31930b25e691cd5d71a19061 not found: ID does not exist" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.516346 4837 scope.go:117] "RemoveContainer" containerID="8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26" Jan 11 17:52:32 crc kubenswrapper[4837]: E0111 17:52:32.516914 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26\": container with ID starting with 8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26 not found: ID does not exist" containerID="8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.516945 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26"} err="failed to get container status \"8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26\": rpc error: code = NotFound desc = could not find container \"8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26\": container with ID starting with 8bd489ab6dc35becad507f4a548c14211092e6f632d5e3ae7c9a256d90b9dc26 not found: ID does not exist" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.516960 4837 scope.go:117] "RemoveContainer" containerID="eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937" Jan 11 17:52:32 crc kubenswrapper[4837]: E0111 17:52:32.517275 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937\": container with ID starting with eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937 not found: ID does not exist" containerID="eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.517333 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937"} err="failed to get container status \"eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937\": rpc error: code = NotFound desc = could not find container 
\"eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937\": container with ID starting with eda43633915aae34c66b30d5cf6200395a8485bfa9dba6f466fbb9b8caa65937 not found: ID does not exist" Jan 11 17:52:32 crc kubenswrapper[4837]: I0111 17:52:32.803718 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:32 crc kubenswrapper[4837]: W0111 17:52:32.812934 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod473ed472_fd1a_47bf_8dd8_fed3c09f25d9.slice/crio-623a6989e054318bc13472c5f562cb0fc4264f29297a2df96eeeac75dee91dbc WatchSource:0}: Error finding container 623a6989e054318bc13472c5f562cb0fc4264f29297a2df96eeeac75dee91dbc: Status 404 returned error can't find the container with id 623a6989e054318bc13472c5f562cb0fc4264f29297a2df96eeeac75dee91dbc Jan 11 17:52:33 crc kubenswrapper[4837]: I0111 17:52:33.555416 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" event={"ID":"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f","Type":"ContainerStarted","Data":"36d7dcea07af7a2b3cdbeff10a258a5334de2d4d97b15aeca8b1d43fdd0bdb14"} Jan 11 17:52:33 crc kubenswrapper[4837]: I0111 17:52:33.557258 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerStarted","Data":"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8"} Jan 11 17:52:33 crc kubenswrapper[4837]: I0111 17:52:33.557310 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerStarted","Data":"623a6989e054318bc13472c5f562cb0fc4264f29297a2df96eeeac75dee91dbc"} Jan 11 17:52:33 crc kubenswrapper[4837]: I0111 17:52:33.579959 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" 
podStartSLOduration=2.145872519 podStartE2EDuration="10.579942779s" podCreationTimestamp="2026-01-11 17:52:23 +0000 UTC" firstStartedPulling="2026-01-11 17:52:23.903360986 +0000 UTC m=+1318.081553732" lastFinishedPulling="2026-01-11 17:52:32.337431256 +0000 UTC m=+1326.515623992" observedRunningTime="2026-01-11 17:52:33.571846611 +0000 UTC m=+1327.750039347" watchObservedRunningTime="2026-01-11 17:52:33.579942779 +0000 UTC m=+1327.758135485" Jan 11 17:52:37 crc kubenswrapper[4837]: I0111 17:52:37.604899 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerStarted","Data":"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c"} Jan 11 17:52:43 crc kubenswrapper[4837]: I0111 17:52:43.514562 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 17:52:43 crc kubenswrapper[4837]: I0111 17:52:43.661983 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerStarted","Data":"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c"} Jan 11 17:52:45 crc kubenswrapper[4837]: I0111 17:52:45.683643 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerStarted","Data":"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e"} Jan 11 17:52:45 crc kubenswrapper[4837]: I0111 17:52:45.683954 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-central-agent" containerID="cri-o://94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" gracePeriod=30 Jan 11 17:52:45 crc kubenswrapper[4837]: I0111 17:52:45.684136 4837 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="sg-core" containerID="cri-o://78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" gracePeriod=30 Jan 11 17:52:45 crc kubenswrapper[4837]: I0111 17:52:45.684028 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="proxy-httpd" containerID="cri-o://79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" gracePeriod=30 Jan 11 17:52:45 crc kubenswrapper[4837]: I0111 17:52:45.684105 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-notification-agent" containerID="cri-o://7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" gracePeriod=30 Jan 11 17:52:45 crc kubenswrapper[4837]: I0111 17:52:45.684345 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 11 17:52:45 crc kubenswrapper[4837]: I0111 17:52:45.706439 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.5506141719999995 podStartE2EDuration="18.706426964s" podCreationTimestamp="2026-01-11 17:52:27 +0000 UTC" firstStartedPulling="2026-01-11 17:52:32.816950746 +0000 UTC m=+1326.995143452" lastFinishedPulling="2026-01-11 17:52:44.972763538 +0000 UTC m=+1339.150956244" observedRunningTime="2026-01-11 17:52:45.706040973 +0000 UTC m=+1339.884233689" watchObservedRunningTime="2026-01-11 17:52:45.706426964 +0000 UTC m=+1339.884619670" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.477385 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.551265 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-combined-ca-bundle\") pod \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.551338 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-log-httpd\") pod \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.551382 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-config-data\") pod \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.551436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-sg-core-conf-yaml\") pod \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.551458 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr6lf\" (UniqueName: \"kubernetes.io/projected/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-kube-api-access-qr6lf\") pod \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.551507 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-run-httpd\") pod \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.551592 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-scripts\") pod \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\" (UID: \"473ed472-fd1a-47bf-8dd8-fed3c09f25d9\") " Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.552284 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "473ed472-fd1a-47bf-8dd8-fed3c09f25d9" (UID: "473ed472-fd1a-47bf-8dd8-fed3c09f25d9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.552631 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "473ed472-fd1a-47bf-8dd8-fed3c09f25d9" (UID: "473ed472-fd1a-47bf-8dd8-fed3c09f25d9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.556410 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-kube-api-access-qr6lf" (OuterVolumeSpecName: "kube-api-access-qr6lf") pod "473ed472-fd1a-47bf-8dd8-fed3c09f25d9" (UID: "473ed472-fd1a-47bf-8dd8-fed3c09f25d9"). InnerVolumeSpecName "kube-api-access-qr6lf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.556503 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-scripts" (OuterVolumeSpecName: "scripts") pod "473ed472-fd1a-47bf-8dd8-fed3c09f25d9" (UID: "473ed472-fd1a-47bf-8dd8-fed3c09f25d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.585883 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "473ed472-fd1a-47bf-8dd8-fed3c09f25d9" (UID: "473ed472-fd1a-47bf-8dd8-fed3c09f25d9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.643937 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "473ed472-fd1a-47bf-8dd8-fed3c09f25d9" (UID: "473ed472-fd1a-47bf-8dd8-fed3c09f25d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.653769 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.653805 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.653815 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.653826 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr6lf\" (UniqueName: \"kubernetes.io/projected/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-kube-api-access-qr6lf\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.653835 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.653847 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.684974 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-config-data" (OuterVolumeSpecName: "config-data") pod "473ed472-fd1a-47bf-8dd8-fed3c09f25d9" (UID: "473ed472-fd1a-47bf-8dd8-fed3c09f25d9"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693173 4837 generic.go:334] "Generic (PLEG): container finished" podID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerID="79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" exitCode=0 Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693211 4837 generic.go:334] "Generic (PLEG): container finished" podID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerID="78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" exitCode=2 Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693222 4837 generic.go:334] "Generic (PLEG): container finished" podID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerID="7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" exitCode=0 Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693233 4837 generic.go:334] "Generic (PLEG): container finished" podID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerID="94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" exitCode=0 Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693234 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerDied","Data":"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e"} Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerDied","Data":"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c"} Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693273 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerDied","Data":"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c"} 
Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693281 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerDied","Data":"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8"} Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693290 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"473ed472-fd1a-47bf-8dd8-fed3c09f25d9","Type":"ContainerDied","Data":"623a6989e054318bc13472c5f562cb0fc4264f29297a2df96eeeac75dee91dbc"} Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693318 4837 scope.go:117] "RemoveContainer" containerID="79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.693589 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.713065 4837 scope.go:117] "RemoveContainer" containerID="78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.739115 4837 scope.go:117] "RemoveContainer" containerID="7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.743964 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.752497 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.755505 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/473ed472-fd1a-47bf-8dd8-fed3c09f25d9-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.763198 4837 scope.go:117] "RemoveContainer" 
containerID="94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.774666 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.775110 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="sg-core" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775131 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="sg-core" Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.775156 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-notification-agent" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775164 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-notification-agent" Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.775174 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-central-agent" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775181 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-central-agent" Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.775210 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="proxy-httpd" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775219 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="proxy-httpd" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775406 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-central-agent" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775419 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="sg-core" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775432 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="ceilometer-notification-agent" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.775450 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" containerName="proxy-httpd" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.777472 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.782872 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.783617 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.790335 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.803564 4837 scope.go:117] "RemoveContainer" containerID="79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.804081 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": container with ID starting with 79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e not found: ID does not exist" containerID="79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" Jan 11 
17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.804135 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e"} err="failed to get container status \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": rpc error: code = NotFound desc = could not find container \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": container with ID starting with 79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.804165 4837 scope.go:117] "RemoveContainer" containerID="78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.804729 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": container with ID starting with 78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c not found: ID does not exist" containerID="78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.804771 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c"} err="failed to get container status \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": rpc error: code = NotFound desc = could not find container \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": container with ID starting with 78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.804798 4837 scope.go:117] "RemoveContainer" 
containerID="7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.805183 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": container with ID starting with 7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c not found: ID does not exist" containerID="7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.805221 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c"} err="failed to get container status \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": rpc error: code = NotFound desc = could not find container \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": container with ID starting with 7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.805240 4837 scope.go:117] "RemoveContainer" containerID="94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" Jan 11 17:52:46 crc kubenswrapper[4837]: E0111 17:52:46.805487 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": container with ID starting with 94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8 not found: ID does not exist" containerID="94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.805515 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8"} err="failed to get container status \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": rpc error: code = NotFound desc = could not find container \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": container with ID starting with 94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8 not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.805533 4837 scope.go:117] "RemoveContainer" containerID="79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.806014 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e"} err="failed to get container status \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": rpc error: code = NotFound desc = could not find container \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": container with ID starting with 79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.806040 4837 scope.go:117] "RemoveContainer" containerID="78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.807125 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c"} err="failed to get container status \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": rpc error: code = NotFound desc = could not find container \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": container with ID starting with 78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c not found: ID does not 
exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.807176 4837 scope.go:117] "RemoveContainer" containerID="7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.807449 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c"} err="failed to get container status \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": rpc error: code = NotFound desc = could not find container \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": container with ID starting with 7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.807469 4837 scope.go:117] "RemoveContainer" containerID="94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.807959 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8"} err="failed to get container status \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": rpc error: code = NotFound desc = could not find container \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": container with ID starting with 94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8 not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.807978 4837 scope.go:117] "RemoveContainer" containerID="79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.808336 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e"} err="failed to get container status 
\"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": rpc error: code = NotFound desc = could not find container \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": container with ID starting with 79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.808367 4837 scope.go:117] "RemoveContainer" containerID="78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.808615 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c"} err="failed to get container status \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": rpc error: code = NotFound desc = could not find container \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": container with ID starting with 78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.808644 4837 scope.go:117] "RemoveContainer" containerID="7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.808869 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c"} err="failed to get container status \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": rpc error: code = NotFound desc = could not find container \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": container with ID starting with 7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.808888 4837 scope.go:117] "RemoveContainer" 
containerID="94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.809789 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8"} err="failed to get container status \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": rpc error: code = NotFound desc = could not find container \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": container with ID starting with 94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8 not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.809815 4837 scope.go:117] "RemoveContainer" containerID="79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.810019 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e"} err="failed to get container status \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": rpc error: code = NotFound desc = could not find container \"79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e\": container with ID starting with 79a4d8dcb0e12c9fe797ef4fc8a745be529ca049ed4264d14e9cc92f69d3920e not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.810038 4837 scope.go:117] "RemoveContainer" containerID="78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.810291 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c"} err="failed to get container status \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": rpc error: code = NotFound desc = could 
not find container \"78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c\": container with ID starting with 78eb8282a819e6e389f473e3f0590438dd87fb6d45d2042732de1e03168ccd4c not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.810310 4837 scope.go:117] "RemoveContainer" containerID="7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.810582 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c"} err="failed to get container status \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": rpc error: code = NotFound desc = could not find container \"7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c\": container with ID starting with 7a224cca94ebe32085b70abcadb938580b245daff86bff8e709e063b8f23811c not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.810598 4837 scope.go:117] "RemoveContainer" containerID="94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.811376 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8"} err="failed to get container status \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": rpc error: code = NotFound desc = could not find container \"94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8\": container with ID starting with 94e787115852033fe6fbe7306b6a56b15a320e12713c6571f0efbead1bf65ff8 not found: ID does not exist" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.856623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksfvc\" (UniqueName: 
\"kubernetes.io/projected/733daa2a-14f9-4db9-9b06-a6afe117d45f-kube-api-access-ksfvc\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.856754 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-scripts\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.856794 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.856831 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-config-data\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.856962 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.857018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-log-httpd\") pod \"ceilometer-0\" (UID: 
\"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.857059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-run-httpd\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958237 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-config-data\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958310 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-log-httpd\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958357 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-run-httpd\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksfvc\" (UniqueName: \"kubernetes.io/projected/733daa2a-14f9-4db9-9b06-a6afe117d45f-kube-api-access-ksfvc\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-scripts\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958944 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-run-httpd\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.958991 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-log-httpd\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.962063 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-config-data\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.962355 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.962837 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-scripts\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.962857 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:46 crc kubenswrapper[4837]: I0111 17:52:46.978013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksfvc\" (UniqueName: \"kubernetes.io/projected/733daa2a-14f9-4db9-9b06-a6afe117d45f-kube-api-access-ksfvc\") pod \"ceilometer-0\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " pod="openstack/ceilometer-0" Jan 11 17:52:47 crc kubenswrapper[4837]: I0111 17:52:47.095655 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:52:47 crc kubenswrapper[4837]: W0111 17:52:47.437054 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod733daa2a_14f9_4db9_9b06_a6afe117d45f.slice/crio-efe9482e114d066bd9739e224ef72dd9fe4cf968aa0171d9071da9e71c1e86d0 WatchSource:0}: Error finding container efe9482e114d066bd9739e224ef72dd9fe4cf968aa0171d9071da9e71c1e86d0: Status 404 returned error can't find the container with id efe9482e114d066bd9739e224ef72dd9fe4cf968aa0171d9071da9e71c1e86d0 Jan 11 17:52:47 crc kubenswrapper[4837]: I0111 17:52:47.437919 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:52:47 crc kubenswrapper[4837]: I0111 17:52:47.706181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerStarted","Data":"efe9482e114d066bd9739e224ef72dd9fe4cf968aa0171d9071da9e71c1e86d0"} Jan 11 17:52:48 crc kubenswrapper[4837]: I0111 17:52:48.379196 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473ed472-fd1a-47bf-8dd8-fed3c09f25d9" path="/var/lib/kubelet/pods/473ed472-fd1a-47bf-8dd8-fed3c09f25d9/volumes" Jan 11 17:52:48 crc kubenswrapper[4837]: I0111 17:52:48.717512 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerStarted","Data":"c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46"} Jan 11 17:52:49 crc kubenswrapper[4837]: I0111 17:52:49.729344 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerStarted","Data":"b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1"} Jan 11 17:52:50 crc kubenswrapper[4837]: I0111 17:52:50.745416 4837 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerStarted","Data":"3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b"} Jan 11 17:52:51 crc kubenswrapper[4837]: I0111 17:52:51.759848 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerStarted","Data":"bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960"} Jan 11 17:52:51 crc kubenswrapper[4837]: I0111 17:52:51.762046 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 11 17:52:51 crc kubenswrapper[4837]: I0111 17:52:51.763091 4837 generic.go:334] "Generic (PLEG): container finished" podID="0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" containerID="36d7dcea07af7a2b3cdbeff10a258a5334de2d4d97b15aeca8b1d43fdd0bdb14" exitCode=0 Jan 11 17:52:51 crc kubenswrapper[4837]: I0111 17:52:51.763401 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" event={"ID":"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f","Type":"ContainerDied","Data":"36d7dcea07af7a2b3cdbeff10a258a5334de2d4d97b15aeca8b1d43fdd0bdb14"} Jan 11 17:52:51 crc kubenswrapper[4837]: I0111 17:52:51.795058 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.156104129 podStartE2EDuration="5.795029637s" podCreationTimestamp="2026-01-11 17:52:46 +0000 UTC" firstStartedPulling="2026-01-11 17:52:47.440381536 +0000 UTC m=+1341.618574242" lastFinishedPulling="2026-01-11 17:52:51.079307044 +0000 UTC m=+1345.257499750" observedRunningTime="2026-01-11 17:52:51.783387544 +0000 UTC m=+1345.961580270" watchObservedRunningTime="2026-01-11 17:52:51.795029637 +0000 UTC m=+1345.973222383" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.159654 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.268799 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsn99\" (UniqueName: \"kubernetes.io/projected/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-kube-api-access-zsn99\") pod \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.268900 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-combined-ca-bundle\") pod \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.268961 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-config-data\") pod \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.269061 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-scripts\") pod \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\" (UID: \"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f\") " Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.275577 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-scripts" (OuterVolumeSpecName: "scripts") pod "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" (UID: "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.276216 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-kube-api-access-zsn99" (OuterVolumeSpecName: "kube-api-access-zsn99") pod "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" (UID: "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f"). InnerVolumeSpecName "kube-api-access-zsn99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.302540 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" (UID: "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.308567 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-config-data" (OuterVolumeSpecName: "config-data") pod "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" (UID: "0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.371392 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.371424 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.371436 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsn99\" (UniqueName: \"kubernetes.io/projected/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-kube-api-access-zsn99\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.371447 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.787013 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.787088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q5qqm" event={"ID":"0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f","Type":"ContainerDied","Data":"a5388a8b5937eb843b4ead121b654bba453556dea7529f6c2d7295942b160ad5"} Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.787125 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5388a8b5937eb843b4ead121b654bba453556dea7529f6c2d7295942b160ad5" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.926878 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 11 17:52:53 crc kubenswrapper[4837]: E0111 17:52:53.927914 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" containerName="nova-cell0-conductor-db-sync" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.928333 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" containerName="nova-cell0-conductor-db-sync" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.929156 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" containerName="nova-cell0-conductor-db-sync" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.930659 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.934030 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2gtdx" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.934136 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.937616 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.985386 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpkjr\" (UniqueName: \"kubernetes.io/projected/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-kube-api-access-cpkjr\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.985621 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:53 crc kubenswrapper[4837]: I0111 17:52:53.985723 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.087212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpkjr\" (UniqueName: 
\"kubernetes.io/projected/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-kube-api-access-cpkjr\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.087343 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.087374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.093836 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.095115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.129296 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpkjr\" (UniqueName: \"kubernetes.io/projected/bb1d44ca-482f-455e-bb8c-7c409c3ad6f8-kube-api-access-cpkjr\") pod \"nova-cell0-conductor-0\" (UID: 
\"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8\") " pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.264647 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:54 crc kubenswrapper[4837]: I0111 17:52:54.833761 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 11 17:52:55 crc kubenswrapper[4837]: I0111 17:52:55.829508 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8","Type":"ContainerStarted","Data":"5c7c2f445acd605c7ffddd6f424b537d7e9cc238450b16cf4c0d9ca03950de26"} Jan 11 17:52:55 crc kubenswrapper[4837]: I0111 17:52:55.830183 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 11 17:52:55 crc kubenswrapper[4837]: I0111 17:52:55.830208 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb1d44ca-482f-455e-bb8c-7c409c3ad6f8","Type":"ContainerStarted","Data":"91f51d187c08fa3a1e7d68b2afddb5ebdaaf3e29790f31cf97c749b703b3fac9"} Jan 11 17:52:55 crc kubenswrapper[4837]: I0111 17:52:55.850936 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.850919654 podStartE2EDuration="2.850919654s" podCreationTimestamp="2026-01-11 17:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:52:55.848496059 +0000 UTC m=+1350.026688765" watchObservedRunningTime="2026-01-11 17:52:55.850919654 +0000 UTC m=+1350.029112350" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.315202 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 
17:53:04.773629 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5wzxr"] Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.775045 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.778094 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.780261 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.789410 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5wzxr"] Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.901531 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7fbr\" (UniqueName: \"kubernetes.io/projected/bb261701-19a1-4f8f-a84b-e8748c2eb561-kube-api-access-m7fbr\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.901586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-scripts\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.901791 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-config-data\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " 
pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.901820 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.940321 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.944749 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.949544 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 11 17:53:04 crc kubenswrapper[4837]: I0111 17:53:04.984317 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.003973 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7fbr\" (UniqueName: \"kubernetes.io/projected/bb261701-19a1-4f8f-a84b-e8748c2eb561-kube-api-access-m7fbr\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.004038 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwrzf\" (UniqueName: \"kubernetes.io/projected/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-kube-api-access-lwrzf\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 
17:53:05.004069 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-scripts\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.004124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.008823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-config-data\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.008871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.008931 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.013080 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.014512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.015516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-config-data\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.019659 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.019687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.022498 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.051251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7fbr\" (UniqueName: \"kubernetes.io/projected/bb261701-19a1-4f8f-a84b-e8748c2eb561-kube-api-access-m7fbr\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.053016 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-scripts\") pod \"nova-cell0-cell-mapping-5wzxr\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " 
pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.098130 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.110825 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwrzf\" (UniqueName: \"kubernetes.io/projected/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-kube-api-access-lwrzf\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.110885 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-config-data\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.110919 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.110964 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66fa097f-464b-4efd-b0e3-8129153d92d5-logs\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.110998 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.111016 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.111048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbvw\" (UniqueName: \"kubernetes.io/projected/66fa097f-464b-4efd-b0e3-8129153d92d5-kube-api-access-jtbvw\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.115046 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.132532 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.145197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwrzf\" (UniqueName: \"kubernetes.io/projected/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-kube-api-access-lwrzf\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.152067 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.153119 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.160480 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.178981 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.212335 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.212430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-config-data\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.212513 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66fa097f-464b-4efd-b0e3-8129153d92d5-logs\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.212560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.212598 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbvw\" (UniqueName: \"kubernetes.io/projected/66fa097f-464b-4efd-b0e3-8129153d92d5-kube-api-access-jtbvw\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.212622 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk2qr\" (UniqueName: \"kubernetes.io/projected/a362ee58-5327-4ec1-95ca-f0579f471f84-kube-api-access-xk2qr\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.212642 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-config-data\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.216519 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66fa097f-464b-4efd-b0e3-8129153d92d5-logs\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.227356 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 
17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.262055 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.263197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-config-data\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.263783 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbvw\" (UniqueName: \"kubernetes.io/projected/66fa097f-464b-4efd-b0e3-8129153d92d5-kube-api-access-jtbvw\") pod \"nova-api-0\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.265970 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.271611 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.273712 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.299472 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.316568 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk2qr\" (UniqueName: \"kubernetes.io/projected/a362ee58-5327-4ec1-95ca-f0579f471f84-kube-api-access-xk2qr\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.316840 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-config-data\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.316889 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.316911 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-config-data\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.316925 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2379c115-d0b5-4ed8-bcc3-1814d13efcef-logs\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.316940 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqvrm\" (UniqueName: \"kubernetes.io/projected/2379c115-d0b5-4ed8-bcc3-1814d13efcef-kube-api-access-vqvrm\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.317076 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.322892 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.335035 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-config-data\") pod \"nova-scheduler-0\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.341215 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk2qr\" (UniqueName: \"kubernetes.io/projected/a362ee58-5327-4ec1-95ca-f0579f471f84-kube-api-access-xk2qr\") pod \"nova-scheduler-0\" 
(UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.364194 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-8q68w"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.365695 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.403315 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-8q68w"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.404013 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.419114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.419176 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.419234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-config-data\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 
17:53:05.419256 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2379c115-d0b5-4ed8-bcc3-1814d13efcef-logs\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.419282 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqvrm\" (UniqueName: \"kubernetes.io/projected/2379c115-d0b5-4ed8-bcc3-1814d13efcef-kube-api-access-vqvrm\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.420214 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2379c115-d0b5-4ed8-bcc3-1814d13efcef-logs\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.420311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt2xh\" (UniqueName: \"kubernetes.io/projected/a01c38b2-b0ff-400f-a6af-3be08fad9373-kube-api-access-qt2xh\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.420374 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.420492 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-config\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.420630 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.420664 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.423200 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-config-data\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.425467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.443183 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqvrm\" (UniqueName: \"kubernetes.io/projected/2379c115-d0b5-4ed8-bcc3-1814d13efcef-kube-api-access-vqvrm\") pod \"nova-metadata-0\" (UID: 
\"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.515168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.529629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.530534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.530632 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.530754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt2xh\" (UniqueName: \"kubernetes.io/projected/a01c38b2-b0ff-400f-a6af-3be08fad9373-kube-api-access-qt2xh\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.530790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.530857 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-config\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.530950 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.531946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.532252 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.532466 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-swift-storage-0\") pod 
\"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.532629 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-config\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.557610 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt2xh\" (UniqueName: \"kubernetes.io/projected/a01c38b2-b0ff-400f-a6af-3be08fad9373-kube-api-access-qt2xh\") pod \"dnsmasq-dns-845d6d6f59-8q68w\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.591510 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.707749 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.774832 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5wzxr"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.935830 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.940558 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5wzxr" event={"ID":"bb261701-19a1-4f8f-a84b-e8748c2eb561","Type":"ContainerStarted","Data":"645e7db218f4963327f9c0f84cb8a6f869269b704c7c603ea1b7e10fcf47e732"} Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.941610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017","Type":"ContainerStarted","Data":"64c0feed2f7dcfe54d08da836ffa3b2a08d7f3e1439561924272aa7c15cd0b30"} Jan 11 17:53:05 crc kubenswrapper[4837]: I0111 17:53:05.947554 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.189944 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-78z78"] Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.191951 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.194599 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.195449 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.200143 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-78z78"] Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.207497 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.249564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-scripts\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.249636 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppf49\" (UniqueName: \"kubernetes.io/projected/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-kube-api-access-ppf49\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.249685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " 
pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.249705 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-config-data\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.304935 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.351849 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-scripts\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.351916 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppf49\" (UniqueName: \"kubernetes.io/projected/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-kube-api-access-ppf49\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.351974 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.352007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-config-data\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.361107 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.361162 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-config-data\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.376334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-scripts\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.383278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppf49\" (UniqueName: \"kubernetes.io/projected/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-kube-api-access-ppf49\") pod \"nova-cell1-conductor-db-sync-78z78\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: W0111 17:53:06.415834 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda01c38b2_b0ff_400f_a6af_3be08fad9373.slice/crio-0e581973aa6483229ef8f725f63551923adbb0ec9eb872dfeb60cec9534fb4ac WatchSource:0}: Error finding container 0e581973aa6483229ef8f725f63551923adbb0ec9eb872dfeb60cec9534fb4ac: Status 404 returned error can't find the container with id 0e581973aa6483229ef8f725f63551923adbb0ec9eb872dfeb60cec9534fb4ac Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.440842 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-8q68w"] Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.555378 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.969084 4837 generic.go:334] "Generic (PLEG): container finished" podID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerID="387185be2dafebb0af31a3a4622b4f11ac5d86bde2c0f5c750fae35bed33aefb" exitCode=0 Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.969270 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" event={"ID":"a01c38b2-b0ff-400f-a6af-3be08fad9373","Type":"ContainerDied","Data":"387185be2dafebb0af31a3a4622b4f11ac5d86bde2c0f5c750fae35bed33aefb"} Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.969566 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" event={"ID":"a01c38b2-b0ff-400f-a6af-3be08fad9373","Type":"ContainerStarted","Data":"0e581973aa6483229ef8f725f63551923adbb0ec9eb872dfeb60cec9534fb4ac"} Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.977595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2379c115-d0b5-4ed8-bcc3-1814d13efcef","Type":"ContainerStarted","Data":"0db5df021593db59942caf1328b8d9a970c9c7c7c20761cd2aed33e3fae2d012"} Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 
17:53:06.982491 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5wzxr" event={"ID":"bb261701-19a1-4f8f-a84b-e8748c2eb561","Type":"ContainerStarted","Data":"9b9a7b0972d386315e54f02a158bf16ccf012c9b8757d163aac3a554e6f3dbef"} Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.984308 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66fa097f-464b-4efd-b0e3-8129153d92d5","Type":"ContainerStarted","Data":"a7ed6abb05b741be7317a82fce85ff03f4649cf15c7b46ef5d1ee1ba6b090185"} Jan 11 17:53:06 crc kubenswrapper[4837]: I0111 17:53:06.985778 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a362ee58-5327-4ec1-95ca-f0579f471f84","Type":"ContainerStarted","Data":"7a9bde5fbf28a5f67349a0d4549c640521d1303fd02d734f8ee18f3aa8771f3c"} Jan 11 17:53:07 crc kubenswrapper[4837]: I0111 17:53:07.016410 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5wzxr" podStartSLOduration=3.016388379 podStartE2EDuration="3.016388379s" podCreationTimestamp="2026-01-11 17:53:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:07.004749556 +0000 UTC m=+1361.182942252" watchObservedRunningTime="2026-01-11 17:53:07.016388379 +0000 UTC m=+1361.194581085" Jan 11 17:53:07 crc kubenswrapper[4837]: I0111 17:53:07.106765 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-78z78"] Jan 11 17:53:07 crc kubenswrapper[4837]: I0111 17:53:07.995950 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" event={"ID":"a01c38b2-b0ff-400f-a6af-3be08fad9373","Type":"ContainerStarted","Data":"20a389c545715b8834414800356628d63aacbef1a7d9800a452baf08c3f21efe"} Jan 11 17:53:07 crc kubenswrapper[4837]: I0111 17:53:07.996220 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:07 crc kubenswrapper[4837]: I0111 17:53:07.997557 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-78z78" event={"ID":"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8","Type":"ContainerStarted","Data":"2c96270d6304f99237df043c8920c8cf4396b96d28a8b4e85e2929e5b9c23861"} Jan 11 17:53:08 crc kubenswrapper[4837]: I0111 17:53:08.019766 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" podStartSLOduration=3.019745388 podStartE2EDuration="3.019745388s" podCreationTimestamp="2026-01-11 17:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:08.015222926 +0000 UTC m=+1362.193415642" watchObservedRunningTime="2026-01-11 17:53:08.019745388 +0000 UTC m=+1362.197938094" Jan 11 17:53:08 crc kubenswrapper[4837]: I0111 17:53:08.912359 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:08 crc kubenswrapper[4837]: I0111 17:53:08.922578 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:09 crc kubenswrapper[4837]: I0111 17:53:09.011078 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-78z78" event={"ID":"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8","Type":"ContainerStarted","Data":"f633d6111f7d5bceba726b3ef0fefa136f025a6f85e8dc8f5ebde40e5830b39f"} Jan 11 17:53:09 crc kubenswrapper[4837]: I0111 17:53:09.034832 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-78z78" podStartSLOduration=3.034808061 podStartE2EDuration="3.034808061s" podCreationTimestamp="2026-01-11 17:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:09.02993623 +0000 UTC m=+1363.208128936" watchObservedRunningTime="2026-01-11 17:53:09.034808061 +0000 UTC m=+1363.213000787" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.021535 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66fa097f-464b-4efd-b0e3-8129153d92d5","Type":"ContainerStarted","Data":"8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1"} Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.022000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66fa097f-464b-4efd-b0e3-8129153d92d5","Type":"ContainerStarted","Data":"faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17"} Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.023525 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017","Type":"ContainerStarted","Data":"f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118"} Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.023664 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118" gracePeriod=30 Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.025268 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a362ee58-5327-4ec1-95ca-f0579f471f84","Type":"ContainerStarted","Data":"41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca"} Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.029427 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-log" containerID="cri-o://f88dcfc5081f1d804481272b5c1b2145d1723ab77ce57fa6015712a09e3863af" gracePeriod=30 Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.029755 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2379c115-d0b5-4ed8-bcc3-1814d13efcef","Type":"ContainerStarted","Data":"1380087c19e314dcbb1e016d88cd7a44c22a194bb7ffa84c2133a0bd1eb0e294"} Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.029797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2379c115-d0b5-4ed8-bcc3-1814d13efcef","Type":"ContainerStarted","Data":"f88dcfc5081f1d804481272b5c1b2145d1723ab77ce57fa6015712a09e3863af"} Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.029840 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-metadata" containerID="cri-o://1380087c19e314dcbb1e016d88cd7a44c22a194bb7ffa84c2133a0bd1eb0e294" gracePeriod=30 Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.049730 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.80083345 podStartE2EDuration="6.049712272s" podCreationTimestamp="2026-01-11 17:53:04 +0000 UTC" firstStartedPulling="2026-01-11 17:53:05.947704195 +0000 UTC m=+1360.125896901" lastFinishedPulling="2026-01-11 17:53:09.196583017 +0000 UTC m=+1363.374775723" observedRunningTime="2026-01-11 17:53:10.041364598 +0000 UTC m=+1364.219557304" watchObservedRunningTime="2026-01-11 17:53:10.049712272 +0000 UTC m=+1364.227904978" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.064561 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.816743798 podStartE2EDuration="6.064549021s" 
podCreationTimestamp="2026-01-11 17:53:04 +0000 UTC" firstStartedPulling="2026-01-11 17:53:05.947371486 +0000 UTC m=+1360.125564182" lastFinishedPulling="2026-01-11 17:53:09.195176699 +0000 UTC m=+1363.373369405" observedRunningTime="2026-01-11 17:53:10.063356248 +0000 UTC m=+1364.241548954" watchObservedRunningTime="2026-01-11 17:53:10.064549021 +0000 UTC m=+1364.242741727" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.080081 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.09008132 podStartE2EDuration="5.080063567s" podCreationTimestamp="2026-01-11 17:53:05 +0000 UTC" firstStartedPulling="2026-01-11 17:53:06.207104163 +0000 UTC m=+1360.385296869" lastFinishedPulling="2026-01-11 17:53:09.19708641 +0000 UTC m=+1363.375279116" observedRunningTime="2026-01-11 17:53:10.078034052 +0000 UTC m=+1364.256226748" watchObservedRunningTime="2026-01-11 17:53:10.080063567 +0000 UTC m=+1364.258256273" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.099278 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.226167284 podStartE2EDuration="5.099257312s" podCreationTimestamp="2026-01-11 17:53:05 +0000 UTC" firstStartedPulling="2026-01-11 17:53:06.325818671 +0000 UTC m=+1360.504011367" lastFinishedPulling="2026-01-11 17:53:09.198908689 +0000 UTC m=+1363.377101395" observedRunningTime="2026-01-11 17:53:10.090456846 +0000 UTC m=+1364.268649552" watchObservedRunningTime="2026-01-11 17:53:10.099257312 +0000 UTC m=+1364.277450008" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.264317 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.516699 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 
17:53:10.592444 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 11 17:53:10 crc kubenswrapper[4837]: I0111 17:53:10.592497 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.041380 4837 generic.go:334] "Generic (PLEG): container finished" podID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerID="1380087c19e314dcbb1e016d88cd7a44c22a194bb7ffa84c2133a0bd1eb0e294" exitCode=0 Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.042054 4837 generic.go:334] "Generic (PLEG): container finished" podID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerID="f88dcfc5081f1d804481272b5c1b2145d1723ab77ce57fa6015712a09e3863af" exitCode=143 Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.041444 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2379c115-d0b5-4ed8-bcc3-1814d13efcef","Type":"ContainerDied","Data":"1380087c19e314dcbb1e016d88cd7a44c22a194bb7ffa84c2133a0bd1eb0e294"} Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.042321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2379c115-d0b5-4ed8-bcc3-1814d13efcef","Type":"ContainerDied","Data":"f88dcfc5081f1d804481272b5c1b2145d1723ab77ce57fa6015712a09e3863af"} Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.715223 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.781449 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2379c115-d0b5-4ed8-bcc3-1814d13efcef-logs\") pod \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.781612 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-config-data\") pod \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.781790 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-combined-ca-bundle\") pod \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.781920 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqvrm\" (UniqueName: \"kubernetes.io/projected/2379c115-d0b5-4ed8-bcc3-1814d13efcef-kube-api-access-vqvrm\") pod \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\" (UID: \"2379c115-d0b5-4ed8-bcc3-1814d13efcef\") " Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.782724 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2379c115-d0b5-4ed8-bcc3-1814d13efcef-logs" (OuterVolumeSpecName: "logs") pod "2379c115-d0b5-4ed8-bcc3-1814d13efcef" (UID: "2379c115-d0b5-4ed8-bcc3-1814d13efcef"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.792831 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2379c115-d0b5-4ed8-bcc3-1814d13efcef-kube-api-access-vqvrm" (OuterVolumeSpecName: "kube-api-access-vqvrm") pod "2379c115-d0b5-4ed8-bcc3-1814d13efcef" (UID: "2379c115-d0b5-4ed8-bcc3-1814d13efcef"). InnerVolumeSpecName "kube-api-access-vqvrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.824855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2379c115-d0b5-4ed8-bcc3-1814d13efcef" (UID: "2379c115-d0b5-4ed8-bcc3-1814d13efcef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.828751 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-config-data" (OuterVolumeSpecName: "config-data") pod "2379c115-d0b5-4ed8-bcc3-1814d13efcef" (UID: "2379c115-d0b5-4ed8-bcc3-1814d13efcef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.884420 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqvrm\" (UniqueName: \"kubernetes.io/projected/2379c115-d0b5-4ed8-bcc3-1814d13efcef-kube-api-access-vqvrm\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.884454 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2379c115-d0b5-4ed8-bcc3-1814d13efcef-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.884464 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:11 crc kubenswrapper[4837]: I0111 17:53:11.884473 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2379c115-d0b5-4ed8-bcc3-1814d13efcef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.054393 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.054400 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2379c115-d0b5-4ed8-bcc3-1814d13efcef","Type":"ContainerDied","Data":"0db5df021593db59942caf1328b8d9a970c9c7c7c20761cd2aed33e3fae2d012"} Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.054466 4837 scope.go:117] "RemoveContainer" containerID="1380087c19e314dcbb1e016d88cd7a44c22a194bb7ffa84c2133a0bd1eb0e294" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.092198 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.098945 4837 scope.go:117] "RemoveContainer" containerID="f88dcfc5081f1d804481272b5c1b2145d1723ab77ce57fa6015712a09e3863af" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.100889 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.136362 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:12 crc kubenswrapper[4837]: E0111 17:53:12.137048 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-metadata" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.137063 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-metadata" Jan 11 17:53:12 crc kubenswrapper[4837]: E0111 17:53:12.137087 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-log" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.137094 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-log" Jan 11 17:53:12 crc 
kubenswrapper[4837]: I0111 17:53:12.137296 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-log" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.137310 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" containerName="nova-metadata-metadata" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.138255 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.140709 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.141001 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.156399 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.192270 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492f8f04-263a-4e44-b44a-dc6a84ff214f-logs\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.192557 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xfp\" (UniqueName: \"kubernetes.io/projected/492f8f04-263a-4e44-b44a-dc6a84ff214f-kube-api-access-r6xfp\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.192700 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-config-data\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.192831 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.192951 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.295193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xfp\" (UniqueName: \"kubernetes.io/projected/492f8f04-263a-4e44-b44a-dc6a84ff214f-kube-api-access-r6xfp\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.295324 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-config-data\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.295427 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.295531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.295583 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492f8f04-263a-4e44-b44a-dc6a84ff214f-logs\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.296461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492f8f04-263a-4e44-b44a-dc6a84ff214f-logs\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.301651 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.303225 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.306436 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-config-data\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.325519 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xfp\" (UniqueName: \"kubernetes.io/projected/492f8f04-263a-4e44-b44a-dc6a84ff214f-kube-api-access-r6xfp\") pod \"nova-metadata-0\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " pod="openstack/nova-metadata-0" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.380795 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2379c115-d0b5-4ed8-bcc3-1814d13efcef" path="/var/lib/kubelet/pods/2379c115-d0b5-4ed8-bcc3-1814d13efcef/volumes" Jan 11 17:53:12 crc kubenswrapper[4837]: I0111 17:53:12.470025 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:13 crc kubenswrapper[4837]: I0111 17:53:13.054046 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:13 crc kubenswrapper[4837]: W0111 17:53:13.089842 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod492f8f04_263a_4e44_b44a_dc6a84ff214f.slice/crio-4080a2ce78667e5456bab80283d1c81f5aba86a3bf00f48141df87a402a95f7f WatchSource:0}: Error finding container 4080a2ce78667e5456bab80283d1c81f5aba86a3bf00f48141df87a402a95f7f: Status 404 returned error can't find the container with id 4080a2ce78667e5456bab80283d1c81f5aba86a3bf00f48141df87a402a95f7f Jan 11 17:53:14 crc kubenswrapper[4837]: I0111 17:53:14.094070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"492f8f04-263a-4e44-b44a-dc6a84ff214f","Type":"ContainerStarted","Data":"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022"} Jan 11 17:53:14 crc kubenswrapper[4837]: I0111 17:53:14.094489 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"492f8f04-263a-4e44-b44a-dc6a84ff214f","Type":"ContainerStarted","Data":"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f"} Jan 11 17:53:14 crc kubenswrapper[4837]: I0111 17:53:14.094513 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"492f8f04-263a-4e44-b44a-dc6a84ff214f","Type":"ContainerStarted","Data":"4080a2ce78667e5456bab80283d1c81f5aba86a3bf00f48141df87a402a95f7f"} Jan 11 17:53:14 crc kubenswrapper[4837]: I0111 17:53:14.098113 4837 generic.go:334] "Generic (PLEG): container finished" podID="bb261701-19a1-4f8f-a84b-e8748c2eb561" containerID="9b9a7b0972d386315e54f02a158bf16ccf012c9b8757d163aac3a554e6f3dbef" exitCode=0 Jan 11 17:53:14 crc kubenswrapper[4837]: I0111 17:53:14.098204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5wzxr" event={"ID":"bb261701-19a1-4f8f-a84b-e8748c2eb561","Type":"ContainerDied","Data":"9b9a7b0972d386315e54f02a158bf16ccf012c9b8757d163aac3a554e6f3dbef"} Jan 11 17:53:14 crc kubenswrapper[4837]: I0111 17:53:14.127315 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.127293464 podStartE2EDuration="2.127293464s" podCreationTimestamp="2026-01-11 17:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:14.113987546 +0000 UTC m=+1368.292180302" watchObservedRunningTime="2026-01-11 17:53:14.127293464 +0000 UTC m=+1368.305486180" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.405443 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.405906 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.516642 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.571429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.581715 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.666436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-config-data\") pod \"bb261701-19a1-4f8f-a84b-e8748c2eb561\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.666483 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-scripts\") pod \"bb261701-19a1-4f8f-a84b-e8748c2eb561\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.666524 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-combined-ca-bundle\") pod \"bb261701-19a1-4f8f-a84b-e8748c2eb561\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.666587 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7fbr\" (UniqueName: 
\"kubernetes.io/projected/bb261701-19a1-4f8f-a84b-e8748c2eb561-kube-api-access-m7fbr\") pod \"bb261701-19a1-4f8f-a84b-e8748c2eb561\" (UID: \"bb261701-19a1-4f8f-a84b-e8748c2eb561\") " Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.675303 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb261701-19a1-4f8f-a84b-e8748c2eb561-kube-api-access-m7fbr" (OuterVolumeSpecName: "kube-api-access-m7fbr") pod "bb261701-19a1-4f8f-a84b-e8748c2eb561" (UID: "bb261701-19a1-4f8f-a84b-e8748c2eb561"). InnerVolumeSpecName "kube-api-access-m7fbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.678800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-scripts" (OuterVolumeSpecName: "scripts") pod "bb261701-19a1-4f8f-a84b-e8748c2eb561" (UID: "bb261701-19a1-4f8f-a84b-e8748c2eb561"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.695371 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb261701-19a1-4f8f-a84b-e8748c2eb561" (UID: "bb261701-19a1-4f8f-a84b-e8748c2eb561"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.710616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.728609 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-config-data" (OuterVolumeSpecName: "config-data") pod "bb261701-19a1-4f8f-a84b-e8748c2eb561" (UID: "bb261701-19a1-4f8f-a84b-e8748c2eb561"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.774955 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7fbr\" (UniqueName: \"kubernetes.io/projected/bb261701-19a1-4f8f-a84b-e8748c2eb561-kube-api-access-m7fbr\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.775017 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.775037 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.775056 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb261701-19a1-4f8f-a84b-e8748c2eb561-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.783747 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-g67l9"] Jan 11 17:53:15 crc kubenswrapper[4837]: I0111 17:53:15.784153 4837 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" podUID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerName="dnsmasq-dns" containerID="cri-o://ed239e40b4823cea4d674b64febf13e8f69e00a4959abf08b0694156c212182a" gracePeriod=10 Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.116589 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5wzxr" event={"ID":"bb261701-19a1-4f8f-a84b-e8748c2eb561","Type":"ContainerDied","Data":"645e7db218f4963327f9c0f84cb8a6f869269b704c7c603ea1b7e10fcf47e732"} Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.116636 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="645e7db218f4963327f9c0f84cb8a6f869269b704c7c603ea1b7e10fcf47e732" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.116684 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5wzxr" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.118990 4837 generic.go:334] "Generic (PLEG): container finished" podID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerID="ed239e40b4823cea4d674b64febf13e8f69e00a4959abf08b0694156c212182a" exitCode=0 Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.119034 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" event={"ID":"93f666ee-5f8d-4403-83f6-87d4be8c0961","Type":"ContainerDied","Data":"ed239e40b4823cea4d674b64febf13e8f69e00a4959abf08b0694156c212182a"} Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.120468 4837 generic.go:334] "Generic (PLEG): container finished" podID="6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" containerID="f633d6111f7d5bceba726b3ef0fefa136f025a6f85e8dc8f5ebde40e5830b39f" exitCode=0 Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.120881 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-78z78" 
event={"ID":"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8","Type":"ContainerDied","Data":"f633d6111f7d5bceba726b3ef0fefa136f025a6f85e8dc8f5ebde40e5830b39f"} Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.163600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.196464 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.287534 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-nb\") pod \"93f666ee-5f8d-4403-83f6-87d4be8c0961\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.287623 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-sb\") pod \"93f666ee-5f8d-4403-83f6-87d4be8c0961\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.287864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-svc\") pod \"93f666ee-5f8d-4403-83f6-87d4be8c0961\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.287904 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-swift-storage-0\") pod \"93f666ee-5f8d-4403-83f6-87d4be8c0961\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.287955 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-config\") pod \"93f666ee-5f8d-4403-83f6-87d4be8c0961\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.288028 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8cpm\" (UniqueName: \"kubernetes.io/projected/93f666ee-5f8d-4403-83f6-87d4be8c0961-kube-api-access-l8cpm\") pod \"93f666ee-5f8d-4403-83f6-87d4be8c0961\" (UID: \"93f666ee-5f8d-4403-83f6-87d4be8c0961\") " Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.295094 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f666ee-5f8d-4403-83f6-87d4be8c0961-kube-api-access-l8cpm" (OuterVolumeSpecName: "kube-api-access-l8cpm") pod "93f666ee-5f8d-4403-83f6-87d4be8c0961" (UID: "93f666ee-5f8d-4403-83f6-87d4be8c0961"). InnerVolumeSpecName "kube-api-access-l8cpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.319331 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.319525 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-log" containerID="cri-o://faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17" gracePeriod=30 Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.319900 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-api" containerID="cri-o://8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1" gracePeriod=30 Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.323782 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": EOF" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.324044 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": EOF" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.347361 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.347611 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-log" containerID="cri-o://d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f" gracePeriod=30 Jan 11 17:53:16 crc 
kubenswrapper[4837]: I0111 17:53:16.348078 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-metadata" containerID="cri-o://82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022" gracePeriod=30 Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.367475 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93f666ee-5f8d-4403-83f6-87d4be8c0961" (UID: "93f666ee-5f8d-4403-83f6-87d4be8c0961"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.382481 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93f666ee-5f8d-4403-83f6-87d4be8c0961" (UID: "93f666ee-5f8d-4403-83f6-87d4be8c0961"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.385080 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93f666ee-5f8d-4403-83f6-87d4be8c0961" (UID: "93f666ee-5f8d-4403-83f6-87d4be8c0961"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.392484 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.392515 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.392526 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8cpm\" (UniqueName: \"kubernetes.io/projected/93f666ee-5f8d-4403-83f6-87d4be8c0961-kube-api-access-l8cpm\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.392534 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.394286 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-config" (OuterVolumeSpecName: "config") pod "93f666ee-5f8d-4403-83f6-87d4be8c0961" (UID: "93f666ee-5f8d-4403-83f6-87d4be8c0961"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.404274 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93f666ee-5f8d-4403-83f6-87d4be8c0961" (UID: "93f666ee-5f8d-4403-83f6-87d4be8c0961"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.494331 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.494355 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93f666ee-5f8d-4403-83f6-87d4be8c0961-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.662394 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:16 crc kubenswrapper[4837]: I0111 17:53:16.910150 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.003372 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492f8f04-263a-4e44-b44a-dc6a84ff214f-logs\") pod \"492f8f04-263a-4e44-b44a-dc6a84ff214f\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.003729 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-nova-metadata-tls-certs\") pod \"492f8f04-263a-4e44-b44a-dc6a84ff214f\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.003761 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6xfp\" (UniqueName: \"kubernetes.io/projected/492f8f04-263a-4e44-b44a-dc6a84ff214f-kube-api-access-r6xfp\") pod \"492f8f04-263a-4e44-b44a-dc6a84ff214f\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " Jan 11 17:53:17 crc 
kubenswrapper[4837]: I0111 17:53:17.003903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-config-data\") pod \"492f8f04-263a-4e44-b44a-dc6a84ff214f\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.003945 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-combined-ca-bundle\") pod \"492f8f04-263a-4e44-b44a-dc6a84ff214f\" (UID: \"492f8f04-263a-4e44-b44a-dc6a84ff214f\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.005058 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/492f8f04-263a-4e44-b44a-dc6a84ff214f-logs" (OuterVolumeSpecName: "logs") pod "492f8f04-263a-4e44-b44a-dc6a84ff214f" (UID: "492f8f04-263a-4e44-b44a-dc6a84ff214f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.007906 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492f8f04-263a-4e44-b44a-dc6a84ff214f-kube-api-access-r6xfp" (OuterVolumeSpecName: "kube-api-access-r6xfp") pod "492f8f04-263a-4e44-b44a-dc6a84ff214f" (UID: "492f8f04-263a-4e44-b44a-dc6a84ff214f"). InnerVolumeSpecName "kube-api-access-r6xfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.032117 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "492f8f04-263a-4e44-b44a-dc6a84ff214f" (UID: "492f8f04-263a-4e44-b44a-dc6a84ff214f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.034431 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-config-data" (OuterVolumeSpecName: "config-data") pod "492f8f04-263a-4e44-b44a-dc6a84ff214f" (UID: "492f8f04-263a-4e44-b44a-dc6a84ff214f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.055166 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "492f8f04-263a-4e44-b44a-dc6a84ff214f" (UID: "492f8f04-263a-4e44-b44a-dc6a84ff214f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.102543 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.105498 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.105518 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.105526 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/492f8f04-263a-4e44-b44a-dc6a84ff214f-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.105534 4837 reconciler_common.go:293] "Volume detached 
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/492f8f04-263a-4e44-b44a-dc6a84ff214f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.105542 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6xfp\" (UniqueName: \"kubernetes.io/projected/492f8f04-263a-4e44-b44a-dc6a84ff214f-kube-api-access-r6xfp\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.197715 4837 generic.go:334] "Generic (PLEG): container finished" podID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerID="faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17" exitCode=143 Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.197785 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66fa097f-464b-4efd-b0e3-8129153d92d5","Type":"ContainerDied","Data":"faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17"} Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.202944 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" event={"ID":"93f666ee-5f8d-4403-83f6-87d4be8c0961","Type":"ContainerDied","Data":"94c675d79319e10e0baaa4d982b25b6f72883f8b715f95734542d0c51cf4dea4"} Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.202999 4837 scope.go:117] "RemoveContainer" containerID="ed239e40b4823cea4d674b64febf13e8f69e00a4959abf08b0694156c212182a" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.203160 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-g67l9" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.223750 4837 generic.go:334] "Generic (PLEG): container finished" podID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerID="82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022" exitCode=0 Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.223820 4837 generic.go:334] "Generic (PLEG): container finished" podID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerID="d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f" exitCode=143 Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.224581 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"492f8f04-263a-4e44-b44a-dc6a84ff214f","Type":"ContainerDied","Data":"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022"} Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.224763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"492f8f04-263a-4e44-b44a-dc6a84ff214f","Type":"ContainerDied","Data":"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f"} Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.224858 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"492f8f04-263a-4e44-b44a-dc6a84ff214f","Type":"ContainerDied","Data":"4080a2ce78667e5456bab80283d1c81f5aba86a3bf00f48141df87a402a95f7f"} Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.225040 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.240619 4837 scope.go:117] "RemoveContainer" containerID="fa47df7f17c26e02c12ae6c3c6f7e57a5ec73cd8f7bc2c53cacdf4e36a1ae592" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.252861 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-g67l9"] Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.265247 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-g67l9"] Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.282236 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.299093 4837 scope.go:117] "RemoveContainer" containerID="82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.304229 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.326895 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:17 crc kubenswrapper[4837]: E0111 17:53:17.327307 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerName="dnsmasq-dns" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327318 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerName="dnsmasq-dns" Jan 11 17:53:17 crc kubenswrapper[4837]: E0111 17:53:17.327340 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerName="init" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327346 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerName="init" Jan 11 17:53:17 crc kubenswrapper[4837]: 
E0111 17:53:17.327356 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb261701-19a1-4f8f-a84b-e8748c2eb561" containerName="nova-manage" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327364 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb261701-19a1-4f8f-a84b-e8748c2eb561" containerName="nova-manage" Jan 11 17:53:17 crc kubenswrapper[4837]: E0111 17:53:17.327380 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-log" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327385 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-log" Jan 11 17:53:17 crc kubenswrapper[4837]: E0111 17:53:17.327400 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-metadata" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327406 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-metadata" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327574 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f666ee-5f8d-4403-83f6-87d4be8c0961" containerName="dnsmasq-dns" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327588 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-metadata" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327599 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb261701-19a1-4f8f-a84b-e8748c2eb561" containerName="nova-manage" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.327608 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" containerName="nova-metadata-log" Jan 11 17:53:17 crc 
kubenswrapper[4837]: I0111 17:53:17.327865 4837 scope.go:117] "RemoveContainer" containerID="d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.329179 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.334140 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.336780 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.341005 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.366244 4837 scope.go:117] "RemoveContainer" containerID="82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022" Jan 11 17:53:17 crc kubenswrapper[4837]: E0111 17:53:17.366557 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022\": container with ID starting with 82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022 not found: ID does not exist" containerID="82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.366584 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022"} err="failed to get container status \"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022\": rpc error: code = NotFound desc = could not find container \"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022\": container with ID starting with 
82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022 not found: ID does not exist" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.366604 4837 scope.go:117] "RemoveContainer" containerID="d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f" Jan 11 17:53:17 crc kubenswrapper[4837]: E0111 17:53:17.366897 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f\": container with ID starting with d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f not found: ID does not exist" containerID="d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.366918 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f"} err="failed to get container status \"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f\": rpc error: code = NotFound desc = could not find container \"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f\": container with ID starting with d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f not found: ID does not exist" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.366930 4837 scope.go:117] "RemoveContainer" containerID="82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.367183 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022"} err="failed to get container status \"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022\": rpc error: code = NotFound desc = could not find container \"82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022\": container with ID 
starting with 82a71692421a5b60c07a039a8292d21dc845780418c1a9702caa42f695a91022 not found: ID does not exist" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.367202 4837 scope.go:117] "RemoveContainer" containerID="d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.367447 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f"} err="failed to get container status \"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f\": rpc error: code = NotFound desc = could not find container \"d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f\": container with ID starting with d2c1cf6ce177d7bfff60265f93f3f88a93f89bf44f1eb80086edcccc561c347f not found: ID does not exist" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.411156 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7kqf\" (UniqueName: \"kubernetes.io/projected/67f85854-48e7-4151-9051-da416208058a-kube-api-access-b7kqf\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.411225 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.411250 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-config-data\") pod \"nova-metadata-0\" (UID: 
\"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.411292 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.411337 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f85854-48e7-4151-9051-da416208058a-logs\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.513158 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7kqf\" (UniqueName: \"kubernetes.io/projected/67f85854-48e7-4151-9051-da416208058a-kube-api-access-b7kqf\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.513238 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.513299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-config-data\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.513364 
4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.513433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f85854-48e7-4151-9051-da416208058a-logs\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.513954 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f85854-48e7-4151-9051-da416208058a-logs\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.521456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-config-data\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.525538 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.527292 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.531647 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7kqf\" (UniqueName: \"kubernetes.io/projected/67f85854-48e7-4151-9051-da416208058a-kube-api-access-b7kqf\") pod \"nova-metadata-0\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.612155 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.655663 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.718405 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-config-data\") pod \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.718964 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-combined-ca-bundle\") pod \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.719059 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppf49\" (UniqueName: \"kubernetes.io/projected/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-kube-api-access-ppf49\") pod \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.719118 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-scripts\") pod \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\" (UID: \"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8\") " Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.723591 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-scripts" (OuterVolumeSpecName: "scripts") pod "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" (UID: "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.726183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-kube-api-access-ppf49" (OuterVolumeSpecName: "kube-api-access-ppf49") pod "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" (UID: "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8"). InnerVolumeSpecName "kube-api-access-ppf49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.744282 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" (UID: "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.747983 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-config-data" (OuterVolumeSpecName: "config-data") pod "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" (UID: "6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.821362 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.821393 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.821403 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppf49\" (UniqueName: \"kubernetes.io/projected/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-kube-api-access-ppf49\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:17 crc kubenswrapper[4837]: I0111 17:53:17.821414 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:18 crc kubenswrapper[4837]: W0111 17:53:18.204194 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67f85854_48e7_4151_9051_da416208058a.slice/crio-f464159971be53f939f389d4f897e56cf937b01d5d65a5cb7d09b06703116abf WatchSource:0}: Error finding container f464159971be53f939f389d4f897e56cf937b01d5d65a5cb7d09b06703116abf: Status 404 returned error can't find the container with id f464159971be53f939f389d4f897e56cf937b01d5d65a5cb7d09b06703116abf Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.208356 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.253260 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-78z78" 
event={"ID":"6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8","Type":"ContainerDied","Data":"2c96270d6304f99237df043c8920c8cf4396b96d28a8b4e85e2929e5b9c23861"} Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.253329 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c96270d6304f99237df043c8920c8cf4396b96d28a8b4e85e2929e5b9c23861" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.253391 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-78z78" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.256439 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 11 17:53:18 crc kubenswrapper[4837]: E0111 17:53:18.256844 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" containerName="nova-cell1-conductor-db-sync" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.256875 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" containerName="nova-cell1-conductor-db-sync" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.257045 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" containerName="nova-cell1-conductor-db-sync" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.257573 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.261262 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.284215 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a362ee58-5327-4ec1-95ca-f0579f471f84" containerName="nova-scheduler-scheduler" containerID="cri-o://41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" gracePeriod=30 Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.284424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67f85854-48e7-4151-9051-da416208058a","Type":"ContainerStarted","Data":"f464159971be53f939f389d4f897e56cf937b01d5d65a5cb7d09b06703116abf"} Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.302105 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.341074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d7f74b-9a47-4152-bec1-11e05030e750-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.341190 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d7f74b-9a47-4152-bec1-11e05030e750-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.341318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-d6zwn\" (UniqueName: \"kubernetes.io/projected/88d7f74b-9a47-4152-bec1-11e05030e750-kube-api-access-d6zwn\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.373355 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492f8f04-263a-4e44-b44a-dc6a84ff214f" path="/var/lib/kubelet/pods/492f8f04-263a-4e44-b44a-dc6a84ff214f/volumes" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.373973 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f666ee-5f8d-4403-83f6-87d4be8c0961" path="/var/lib/kubelet/pods/93f666ee-5f8d-4403-83f6-87d4be8c0961/volumes" Jan 11 17:53:18 crc kubenswrapper[4837]: E0111 17:53:18.402442 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da0d3e9_aabb_41f6_aaad_7cb1d1f203c8.slice/crio-2c96270d6304f99237df043c8920c8cf4396b96d28a8b4e85e2929e5b9c23861\": RecentStats: unable to find data in memory cache]" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.443979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d7f74b-9a47-4152-bec1-11e05030e750-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.444617 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d7f74b-9a47-4152-bec1-11e05030e750-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.444692 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6zwn\" (UniqueName: \"kubernetes.io/projected/88d7f74b-9a47-4152-bec1-11e05030e750-kube-api-access-d6zwn\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.456339 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d7f74b-9a47-4152-bec1-11e05030e750-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.456914 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d7f74b-9a47-4152-bec1-11e05030e750-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.459765 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6zwn\" (UniqueName: \"kubernetes.io/projected/88d7f74b-9a47-4152-bec1-11e05030e750-kube-api-access-d6zwn\") pod \"nova-cell1-conductor-0\" (UID: \"88d7f74b-9a47-4152-bec1-11e05030e750\") " pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:18 crc kubenswrapper[4837]: I0111 17:53:18.587564 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:19 crc kubenswrapper[4837]: I0111 17:53:19.055754 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 11 17:53:19 crc kubenswrapper[4837]: W0111 17:53:19.070846 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88d7f74b_9a47_4152_bec1_11e05030e750.slice/crio-6d79a9769b5009fd6c3966c022020b2e6d9e5ebc9862c0d303ced8647979e923 WatchSource:0}: Error finding container 6d79a9769b5009fd6c3966c022020b2e6d9e5ebc9862c0d303ced8647979e923: Status 404 returned error can't find the container with id 6d79a9769b5009fd6c3966c022020b2e6d9e5ebc9862c0d303ced8647979e923 Jan 11 17:53:19 crc kubenswrapper[4837]: I0111 17:53:19.309386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"88d7f74b-9a47-4152-bec1-11e05030e750","Type":"ContainerStarted","Data":"6d79a9769b5009fd6c3966c022020b2e6d9e5ebc9862c0d303ced8647979e923"} Jan 11 17:53:19 crc kubenswrapper[4837]: I0111 17:53:19.314998 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67f85854-48e7-4151-9051-da416208058a","Type":"ContainerStarted","Data":"bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3"} Jan 11 17:53:20 crc kubenswrapper[4837]: I0111 17:53:20.323505 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"88d7f74b-9a47-4152-bec1-11e05030e750","Type":"ContainerStarted","Data":"3eb5308e1a74e0674e6ca49c8e9bf48ec47ff0a76bca6f4416d5029534dcd031"} Jan 11 17:53:20 crc kubenswrapper[4837]: I0111 17:53:20.323899 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:20 crc kubenswrapper[4837]: I0111 17:53:20.347527 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.347508587 podStartE2EDuration="2.347508587s" podCreationTimestamp="2026-01-11 17:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:20.336557123 +0000 UTC m=+1374.514749919" watchObservedRunningTime="2026-01-11 17:53:20.347508587 +0000 UTC m=+1374.525701293" Jan 11 17:53:20 crc kubenswrapper[4837]: E0111 17:53:20.519830 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:53:20 crc kubenswrapper[4837]: E0111 17:53:20.522572 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:53:20 crc kubenswrapper[4837]: E0111 17:53:20.524339 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:53:20 crc kubenswrapper[4837]: E0111 17:53:20.524377 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a362ee58-5327-4ec1-95ca-f0579f471f84" containerName="nova-scheduler-scheduler" Jan 11 17:53:21 
crc kubenswrapper[4837]: I0111 17:53:21.206458 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.293182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-config-data\") pod \"a362ee58-5327-4ec1-95ca-f0579f471f84\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.293413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk2qr\" (UniqueName: \"kubernetes.io/projected/a362ee58-5327-4ec1-95ca-f0579f471f84-kube-api-access-xk2qr\") pod \"a362ee58-5327-4ec1-95ca-f0579f471f84\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.293447 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-combined-ca-bundle\") pod \"a362ee58-5327-4ec1-95ca-f0579f471f84\" (UID: \"a362ee58-5327-4ec1-95ca-f0579f471f84\") " Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.306036 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a362ee58-5327-4ec1-95ca-f0579f471f84-kube-api-access-xk2qr" (OuterVolumeSpecName: "kube-api-access-xk2qr") pod "a362ee58-5327-4ec1-95ca-f0579f471f84" (UID: "a362ee58-5327-4ec1-95ca-f0579f471f84"). InnerVolumeSpecName "kube-api-access-xk2qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.323075 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-config-data" (OuterVolumeSpecName: "config-data") pod "a362ee58-5327-4ec1-95ca-f0579f471f84" (UID: "a362ee58-5327-4ec1-95ca-f0579f471f84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.328367 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a362ee58-5327-4ec1-95ca-f0579f471f84" (UID: "a362ee58-5327-4ec1-95ca-f0579f471f84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.333707 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67f85854-48e7-4151-9051-da416208058a","Type":"ContainerStarted","Data":"b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477"} Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.335476 4837 generic.go:334] "Generic (PLEG): container finished" podID="a362ee58-5327-4ec1-95ca-f0579f471f84" containerID="41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" exitCode=0 Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.335612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a362ee58-5327-4ec1-95ca-f0579f471f84","Type":"ContainerDied","Data":"41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca"} Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.335714 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"a362ee58-5327-4ec1-95ca-f0579f471f84","Type":"ContainerDied","Data":"7a9bde5fbf28a5f67349a0d4549c640521d1303fd02d734f8ee18f3aa8771f3c"} Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.335734 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.335742 4837 scope.go:117] "RemoveContainer" containerID="41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.354485 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.354467957 podStartE2EDuration="4.354467957s" podCreationTimestamp="2026-01-11 17:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:21.35349692 +0000 UTC m=+1375.531689646" watchObservedRunningTime="2026-01-11 17:53:21.354467957 +0000 UTC m=+1375.532660663" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.396159 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.396200 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk2qr\" (UniqueName: \"kubernetes.io/projected/a362ee58-5327-4ec1-95ca-f0579f471f84-kube-api-access-xk2qr\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.396210 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a362ee58-5327-4ec1-95ca-f0579f471f84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.413217 4837 scope.go:117] "RemoveContainer" 
containerID="41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" Jan 11 17:53:21 crc kubenswrapper[4837]: E0111 17:53:21.413665 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca\": container with ID starting with 41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca not found: ID does not exist" containerID="41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.413706 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca"} err="failed to get container status \"41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca\": rpc error: code = NotFound desc = could not find container \"41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca\": container with ID starting with 41584b8c8efc8fccdccf666a89ab7871c0676b7f5c98145efd261b8b22383bca not found: ID does not exist" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.417950 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.435454 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.445819 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:21 crc kubenswrapper[4837]: E0111 17:53:21.446275 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a362ee58-5327-4ec1-95ca-f0579f471f84" containerName="nova-scheduler-scheduler" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.446295 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a362ee58-5327-4ec1-95ca-f0579f471f84" 
containerName="nova-scheduler-scheduler" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.446514 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a362ee58-5327-4ec1-95ca-f0579f471f84" containerName="nova-scheduler-scheduler" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.447168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.452431 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.452287 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.484703 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.484906 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cad8e11f-3ef0-4043-a49e-308c103a973f" containerName="kube-state-metrics" containerID="cri-o://7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957" gracePeriod=30 Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.497286 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nrwj\" (UniqueName: \"kubernetes.io/projected/33ef5a38-f83c-40ea-8036-fe830d8a32a3-kube-api-access-5nrwj\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.497387 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-config-data\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") 
" pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.497474 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.599839 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-config-data\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.599942 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.600068 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nrwj\" (UniqueName: \"kubernetes.io/projected/33ef5a38-f83c-40ea-8036-fe830d8a32a3-kube-api-access-5nrwj\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.604717 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-config-data\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.613453 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.619297 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nrwj\" (UniqueName: \"kubernetes.io/projected/33ef5a38-f83c-40ea-8036-fe830d8a32a3-kube-api-access-5nrwj\") pod \"nova-scheduler-0\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " pod="openstack/nova-scheduler-0" Jan 11 17:53:21 crc kubenswrapper[4837]: I0111 17:53:21.768134 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.083765 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.107275 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-424k8\" (UniqueName: \"kubernetes.io/projected/cad8e11f-3ef0-4043-a49e-308c103a973f-kube-api-access-424k8\") pod \"cad8e11f-3ef0-4043-a49e-308c103a973f\" (UID: \"cad8e11f-3ef0-4043-a49e-308c103a973f\") " Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.120862 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad8e11f-3ef0-4043-a49e-308c103a973f-kube-api-access-424k8" (OuterVolumeSpecName: "kube-api-access-424k8") pod "cad8e11f-3ef0-4043-a49e-308c103a973f" (UID: "cad8e11f-3ef0-4043-a49e-308c103a973f"). InnerVolumeSpecName "kube-api-access-424k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.170238 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.208789 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-config-data\") pod \"66fa097f-464b-4efd-b0e3-8129153d92d5\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.209116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtbvw\" (UniqueName: \"kubernetes.io/projected/66fa097f-464b-4efd-b0e3-8129153d92d5-kube-api-access-jtbvw\") pod \"66fa097f-464b-4efd-b0e3-8129153d92d5\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.209150 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-combined-ca-bundle\") pod \"66fa097f-464b-4efd-b0e3-8129153d92d5\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.209196 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66fa097f-464b-4efd-b0e3-8129153d92d5-logs\") pod \"66fa097f-464b-4efd-b0e3-8129153d92d5\" (UID: \"66fa097f-464b-4efd-b0e3-8129153d92d5\") " Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.209665 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-424k8\" (UniqueName: \"kubernetes.io/projected/cad8e11f-3ef0-4043-a49e-308c103a973f-kube-api-access-424k8\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.210237 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66fa097f-464b-4efd-b0e3-8129153d92d5-logs" (OuterVolumeSpecName: "logs") pod 
"66fa097f-464b-4efd-b0e3-8129153d92d5" (UID: "66fa097f-464b-4efd-b0e3-8129153d92d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.214057 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66fa097f-464b-4efd-b0e3-8129153d92d5-kube-api-access-jtbvw" (OuterVolumeSpecName: "kube-api-access-jtbvw") pod "66fa097f-464b-4efd-b0e3-8129153d92d5" (UID: "66fa097f-464b-4efd-b0e3-8129153d92d5"). InnerVolumeSpecName "kube-api-access-jtbvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.235589 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66fa097f-464b-4efd-b0e3-8129153d92d5" (UID: "66fa097f-464b-4efd-b0e3-8129153d92d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.237074 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-config-data" (OuterVolumeSpecName: "config-data") pod "66fa097f-464b-4efd-b0e3-8129153d92d5" (UID: "66fa097f-464b-4efd-b0e3-8129153d92d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.311369 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtbvw\" (UniqueName: \"kubernetes.io/projected/66fa097f-464b-4efd-b0e3-8129153d92d5-kube-api-access-jtbvw\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.311402 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.311416 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66fa097f-464b-4efd-b0e3-8129153d92d5-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.311428 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fa097f-464b-4efd-b0e3-8129153d92d5-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.344813 4837 generic.go:334] "Generic (PLEG): container finished" podID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerID="8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1" exitCode=0 Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.344867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66fa097f-464b-4efd-b0e3-8129153d92d5","Type":"ContainerDied","Data":"8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1"} Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.344892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66fa097f-464b-4efd-b0e3-8129153d92d5","Type":"ContainerDied","Data":"a7ed6abb05b741be7317a82fce85ff03f4649cf15c7b46ef5d1ee1ba6b090185"} Jan 11 17:53:22 crc kubenswrapper[4837]: 
I0111 17:53:22.344907 4837 scope.go:117] "RemoveContainer" containerID="8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.344987 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.357420 4837 generic.go:334] "Generic (PLEG): container finished" podID="cad8e11f-3ef0-4043-a49e-308c103a973f" containerID="7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957" exitCode=2 Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.357536 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.357531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cad8e11f-3ef0-4043-a49e-308c103a973f","Type":"ContainerDied","Data":"7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957"} Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.357606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cad8e11f-3ef0-4043-a49e-308c103a973f","Type":"ContainerDied","Data":"a21b0b133eabc2ef3669d0ea1e12c8eee8851315f3e5bf7d1bb79bce3fcaeda2"} Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.380838 4837 scope.go:117] "RemoveContainer" containerID="faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.422874 4837 scope.go:117] "RemoveContainer" containerID="8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1" Jan 11 17:53:22 crc kubenswrapper[4837]: E0111 17:53:22.427808 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1\": container with ID starting 
with 8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1 not found: ID does not exist" containerID="8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.427851 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1"} err="failed to get container status \"8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1\": rpc error: code = NotFound desc = could not find container \"8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1\": container with ID starting with 8d9553b51c598777bbe802217fb2c62cb4a91ae68e4c7bbda8a126f4b182a7a1 not found: ID does not exist" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.427875 4837 scope.go:117] "RemoveContainer" containerID="faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.429657 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a362ee58-5327-4ec1-95ca-f0579f471f84" path="/var/lib/kubelet/pods/a362ee58-5327-4ec1-95ca-f0579f471f84/volumes" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.430502 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.430541 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.430560 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: E0111 17:53:22.433408 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17\": container with ID starting with 
faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17 not found: ID does not exist" containerID="faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.433469 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17"} err="failed to get container status \"faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17\": rpc error: code = NotFound desc = could not find container \"faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17\": container with ID starting with faf1bab0d37200648afc7be3eb186f5da2992aaa32ef7c37aefb8e6fa3474b17 not found: ID does not exist" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.433500 4837 scope.go:117] "RemoveContainer" containerID="7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.433888 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: E0111 17:53:22.434486 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-log" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.434551 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-log" Jan 11 17:53:22 crc kubenswrapper[4837]: E0111 17:53:22.434609 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad8e11f-3ef0-4043-a49e-308c103a973f" containerName="kube-state-metrics" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.434619 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad8e11f-3ef0-4043-a49e-308c103a973f" containerName="kube-state-metrics" Jan 11 17:53:22 crc kubenswrapper[4837]: E0111 17:53:22.434636 4837 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-api" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.434644 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-api" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.435013 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-log" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.435051 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad8e11f-3ef0-4043-a49e-308c103a973f" containerName="kube-state-metrics" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.435146 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" containerName="nova-api-api" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.436643 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.440953 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.442802 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.451509 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.458476 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.466590 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.468090 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.469916 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.471275 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.475019 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.501109 4837 scope.go:117] "RemoveContainer" containerID="7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957" Jan 11 17:53:22 crc kubenswrapper[4837]: E0111 17:53:22.501990 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957\": container with ID starting with 7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957 not found: ID does not exist" containerID="7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.502032 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957"} err="failed to get container status \"7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957\": rpc error: code = NotFound desc = could not find container \"7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957\": container with ID starting with 7359edf71a1234b308bbb5aa3c883a06057a34f99aa6ec662b1ce987950a4957 not found: ID does not exist" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.515983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.516023 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75x9v\" (UniqueName: \"kubernetes.io/projected/e2197654-71c4-403f-98d8-994d0225a199-kube-api-access-75x9v\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.516063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.516081 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.516190 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828211f0-0290-4c77-9227-355162626da7-logs\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.516460 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7lkj\" (UniqueName: 
\"kubernetes.io/projected/828211f0-0290-4c77-9227-355162626da7-kube-api-access-d7lkj\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.516502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.516573 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-config-data\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828211f0-0290-4c77-9227-355162626da7-logs\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618554 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7lkj\" (UniqueName: \"kubernetes.io/projected/828211f0-0290-4c77-9227-355162626da7-kube-api-access-d7lkj\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " 
pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-config-data\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618647 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618665 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75x9v\" (UniqueName: \"kubernetes.io/projected/e2197654-71c4-403f-98d8-994d0225a199-kube-api-access-75x9v\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618704 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.618720 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.622010 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828211f0-0290-4c77-9227-355162626da7-logs\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.623832 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.626039 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-config-data\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.626229 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.627184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2197654-71c4-403f-98d8-994d0225a199-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.629416 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.638710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7lkj\" (UniqueName: \"kubernetes.io/projected/828211f0-0290-4c77-9227-355162626da7-kube-api-access-d7lkj\") pod \"nova-api-0\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.643225 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75x9v\" (UniqueName: \"kubernetes.io/projected/e2197654-71c4-403f-98d8-994d0225a199-kube-api-access-75x9v\") pod \"kube-state-metrics-0\" (UID: \"e2197654-71c4-403f-98d8-994d0225a199\") " pod="openstack/kube-state-metrics-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.656809 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.656971 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.761322 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:22 crc kubenswrapper[4837]: I0111 17:53:22.787930 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.156617 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.254181 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.382182 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"828211f0-0290-4c77-9227-355162626da7","Type":"ContainerStarted","Data":"63a4382173bdba943b7dac60ba6885a769c654e8fc4ffe0302c3b49d62ac774f"} Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.383873 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33ef5a38-f83c-40ea-8036-fe830d8a32a3","Type":"ContainerStarted","Data":"05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0"} Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.383924 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33ef5a38-f83c-40ea-8036-fe830d8a32a3","Type":"ContainerStarted","Data":"f5117c2dfd5aed3f94406d50570fe2d997abb59c1e98589d655d4a64b1e3ddad"} Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.384974 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2197654-71c4-403f-98d8-994d0225a199","Type":"ContainerStarted","Data":"1eca4c417af63c131f6a9916eaa41166e6c4f5efdfdaf3472c4cb749011a1278"} Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.404256 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.404236416 podStartE2EDuration="2.404236416s" podCreationTimestamp="2026-01-11 17:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 
17:53:23.399971302 +0000 UTC m=+1377.578164018" watchObservedRunningTime="2026-01-11 17:53:23.404236416 +0000 UTC m=+1377.582429122" Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.540946 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.541570 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-central-agent" containerID="cri-o://c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46" gracePeriod=30 Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.541694 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="proxy-httpd" containerID="cri-o://bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960" gracePeriod=30 Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.541700 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-notification-agent" containerID="cri-o://b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1" gracePeriod=30 Jan 11 17:53:23 crc kubenswrapper[4837]: I0111 17:53:23.541733 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="sg-core" containerID="cri-o://3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b" gracePeriod=30 Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.377160 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66fa097f-464b-4efd-b0e3-8129153d92d5" path="/var/lib/kubelet/pods/66fa097f-464b-4efd-b0e3-8129153d92d5/volumes" Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.378516 4837 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad8e11f-3ef0-4043-a49e-308c103a973f" path="/var/lib/kubelet/pods/cad8e11f-3ef0-4043-a49e-308c103a973f/volumes" Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.397446 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2197654-71c4-403f-98d8-994d0225a199","Type":"ContainerStarted","Data":"530d9a6aa0a5e1b2efe42e6604bee7898cbd790881d51c363571b17d860e451d"} Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.398765 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.402521 4837 generic.go:334] "Generic (PLEG): container finished" podID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerID="bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960" exitCode=0 Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.402551 4837 generic.go:334] "Generic (PLEG): container finished" podID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerID="3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b" exitCode=2 Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.402561 4837 generic.go:334] "Generic (PLEG): container finished" podID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerID="c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46" exitCode=0 Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.402596 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerDied","Data":"bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960"} Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.402630 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerDied","Data":"3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b"} Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.402643 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerDied","Data":"c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46"} Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.404769 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"828211f0-0290-4c77-9227-355162626da7","Type":"ContainerStarted","Data":"8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2"} Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.404788 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"828211f0-0290-4c77-9227-355162626da7","Type":"ContainerStarted","Data":"7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c"} Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.422413 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.9286512820000001 podStartE2EDuration="2.422379986s" podCreationTimestamp="2026-01-11 17:53:22 +0000 UTC" firstStartedPulling="2026-01-11 17:53:23.168472978 +0000 UTC m=+1377.346665684" lastFinishedPulling="2026-01-11 17:53:23.662201672 +0000 UTC m=+1377.840394388" observedRunningTime="2026-01-11 17:53:24.41281228 +0000 UTC m=+1378.591005026" watchObservedRunningTime="2026-01-11 17:53:24.422379986 +0000 UTC m=+1378.600572732" Jan 11 17:53:24 crc kubenswrapper[4837]: I0111 17:53:24.446734 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.446711939 podStartE2EDuration="2.446711939s" podCreationTimestamp="2026-01-11 17:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:24.437665956 +0000 UTC m=+1378.615858662" watchObservedRunningTime="2026-01-11 17:53:24.446711939 +0000 UTC m=+1378.624904655" Jan 11 17:53:24 crc kubenswrapper[4837]: E0111 17:53:24.895164 4837 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.117110 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.181251 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-scripts\") pod \"733daa2a-14f9-4db9-9b06-a6afe117d45f\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.181348 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-config-data\") pod \"733daa2a-14f9-4db9-9b06-a6afe117d45f\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.181413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksfvc\" (UniqueName: \"kubernetes.io/projected/733daa2a-14f9-4db9-9b06-a6afe117d45f-kube-api-access-ksfvc\") pod \"733daa2a-14f9-4db9-9b06-a6afe117d45f\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.181506 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-sg-core-conf-yaml\") pod \"733daa2a-14f9-4db9-9b06-a6afe117d45f\" (UID: 
\"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.181580 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-combined-ca-bundle\") pod \"733daa2a-14f9-4db9-9b06-a6afe117d45f\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.181654 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-log-httpd\") pod \"733daa2a-14f9-4db9-9b06-a6afe117d45f\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.181707 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-run-httpd\") pod \"733daa2a-14f9-4db9-9b06-a6afe117d45f\" (UID: \"733daa2a-14f9-4db9-9b06-a6afe117d45f\") " Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.182313 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "733daa2a-14f9-4db9-9b06-a6afe117d45f" (UID: "733daa2a-14f9-4db9-9b06-a6afe117d45f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.182454 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "733daa2a-14f9-4db9-9b06-a6afe117d45f" (UID: "733daa2a-14f9-4db9-9b06-a6afe117d45f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.189834 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-scripts" (OuterVolumeSpecName: "scripts") pod "733daa2a-14f9-4db9-9b06-a6afe117d45f" (UID: "733daa2a-14f9-4db9-9b06-a6afe117d45f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.189844 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733daa2a-14f9-4db9-9b06-a6afe117d45f-kube-api-access-ksfvc" (OuterVolumeSpecName: "kube-api-access-ksfvc") pod "733daa2a-14f9-4db9-9b06-a6afe117d45f" (UID: "733daa2a-14f9-4db9-9b06-a6afe117d45f"). InnerVolumeSpecName "kube-api-access-ksfvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.217925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "733daa2a-14f9-4db9-9b06-a6afe117d45f" (UID: "733daa2a-14f9-4db9-9b06-a6afe117d45f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.282895 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "733daa2a-14f9-4db9-9b06-a6afe117d45f" (UID: "733daa2a-14f9-4db9-9b06-a6afe117d45f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.284040 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.284134 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.284148 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.284161 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/733daa2a-14f9-4db9-9b06-a6afe117d45f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.284199 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.284211 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksfvc\" (UniqueName: \"kubernetes.io/projected/733daa2a-14f9-4db9-9b06-a6afe117d45f-kube-api-access-ksfvc\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.312393 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-config-data" (OuterVolumeSpecName: "config-data") pod "733daa2a-14f9-4db9-9b06-a6afe117d45f" (UID: "733daa2a-14f9-4db9-9b06-a6afe117d45f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.386421 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733daa2a-14f9-4db9-9b06-a6afe117d45f-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.416459 4837 generic.go:334] "Generic (PLEG): container finished" podID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerID="b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1" exitCode=0 Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.416512 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.416532 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerDied","Data":"b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1"} Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.417818 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"733daa2a-14f9-4db9-9b06-a6afe117d45f","Type":"ContainerDied","Data":"efe9482e114d066bd9739e224ef72dd9fe4cf968aa0171d9071da9e71c1e86d0"} Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.417857 4837 scope.go:117] "RemoveContainer" containerID="bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.438626 4837 scope.go:117] "RemoveContainer" containerID="3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.457474 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.471839 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.485429 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:25 crc kubenswrapper[4837]: E0111 17:53:25.486298 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="sg-core" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.486463 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="sg-core" Jan 11 17:53:25 crc kubenswrapper[4837]: E0111 17:53:25.486613 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="proxy-httpd" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.486757 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="proxy-httpd" Jan 11 17:53:25 crc kubenswrapper[4837]: E0111 17:53:25.486906 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-central-agent" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.487017 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-central-agent" Jan 11 17:53:25 crc kubenswrapper[4837]: E0111 17:53:25.487156 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-notification-agent" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.487298 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-notification-agent" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.487780 4837 scope.go:117] "RemoveContainer" containerID="b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1" Jan 11 17:53:25 crc 
kubenswrapper[4837]: I0111 17:53:25.488082 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-notification-agent" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.488275 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="sg-core" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.488430 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="proxy-httpd" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.488603 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" containerName="ceilometer-central-agent" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.491580 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.493822 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.495124 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.495277 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.495770 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.541156 4837 scope.go:117] "RemoveContainer" containerID="c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.572455 4837 scope.go:117] "RemoveContainer" containerID="bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960" Jan 11 17:53:25 crc 
kubenswrapper[4837]: E0111 17:53:25.572910 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960\": container with ID starting with bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960 not found: ID does not exist" containerID="bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.572951 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960"} err="failed to get container status \"bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960\": rpc error: code = NotFound desc = could not find container \"bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960\": container with ID starting with bb9e8fc5fdf397b5d2cdeaa4535357a40d80a43da8c4ab0471ead928b457d960 not found: ID does not exist" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.572997 4837 scope.go:117] "RemoveContainer" containerID="3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b" Jan 11 17:53:25 crc kubenswrapper[4837]: E0111 17:53:25.573341 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b\": container with ID starting with 3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b not found: ID does not exist" containerID="3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.573360 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b"} err="failed to get container status 
\"3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b\": rpc error: code = NotFound desc = could not find container \"3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b\": container with ID starting with 3f3f30d83ec327526a1ce91de2a24a85d7a4d53b2ad32c0344bc32d7e03e013b not found: ID does not exist" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.573373 4837 scope.go:117] "RemoveContainer" containerID="b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1" Jan 11 17:53:25 crc kubenswrapper[4837]: E0111 17:53:25.573604 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1\": container with ID starting with b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1 not found: ID does not exist" containerID="b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.573635 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1"} err="failed to get container status \"b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1\": rpc error: code = NotFound desc = could not find container \"b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1\": container with ID starting with b9cf89d5fc6a1db3e6027d1987746e496028b18cfdf2922d1ea2a823464a0bb1 not found: ID does not exist" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.573655 4837 scope.go:117] "RemoveContainer" containerID="c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46" Jan 11 17:53:25 crc kubenswrapper[4837]: E0111 17:53:25.573896 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46\": container with ID starting with c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46 not found: ID does not exist" containerID="c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.573920 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46"} err="failed to get container status \"c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46\": rpc error: code = NotFound desc = could not find container \"c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46\": container with ID starting with c173d6c6c423a1f861c50b41c188edb52e3cc20d19dc7ac4ecd2a02137875d46 not found: ID does not exist" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkcfp\" (UniqueName: \"kubernetes.io/projected/ee7eb3c4-5f92-4488-9853-f7e4227e176f-kube-api-access-pkcfp\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-config-data\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " 
pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592429 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592504 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592573 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592709 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-scripts\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.592758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694315 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694367 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkcfp\" (UniqueName: \"kubernetes.io/projected/ee7eb3c4-5f92-4488-9853-f7e4227e176f-kube-api-access-pkcfp\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694403 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-config-data\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694428 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694472 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694509 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694558 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-scripts\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.694582 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.695336 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.695868 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.701816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.702838 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.702912 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-config-data\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.703827 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-scripts\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.704265 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.713902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkcfp\" (UniqueName: \"kubernetes.io/projected/ee7eb3c4-5f92-4488-9853-f7e4227e176f-kube-api-access-pkcfp\") pod \"ceilometer-0\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " pod="openstack/ceilometer-0" Jan 11 17:53:25 crc kubenswrapper[4837]: I0111 17:53:25.809271 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:26 crc kubenswrapper[4837]: I0111 17:53:26.270271 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:26 crc kubenswrapper[4837]: W0111 17:53:26.271592 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee7eb3c4_5f92_4488_9853_f7e4227e176f.slice/crio-eafa9baffe0c05904da0d5d63ccb4b3e251a82add0f708738c1d9fdc6de9f91d WatchSource:0}: Error finding container eafa9baffe0c05904da0d5d63ccb4b3e251a82add0f708738c1d9fdc6de9f91d: Status 404 returned error can't find the container with id eafa9baffe0c05904da0d5d63ccb4b3e251a82add0f708738c1d9fdc6de9f91d Jan 11 17:53:26 crc kubenswrapper[4837]: I0111 17:53:26.387833 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733daa2a-14f9-4db9-9b06-a6afe117d45f" path="/var/lib/kubelet/pods/733daa2a-14f9-4db9-9b06-a6afe117d45f/volumes" Jan 11 17:53:26 crc kubenswrapper[4837]: I0111 17:53:26.429868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerStarted","Data":"eafa9baffe0c05904da0d5d63ccb4b3e251a82add0f708738c1d9fdc6de9f91d"} Jan 11 17:53:26 crc kubenswrapper[4837]: I0111 17:53:26.768838 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 11 17:53:27 crc kubenswrapper[4837]: I0111 17:53:27.477011 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerStarted","Data":"8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b"} Jan 11 17:53:27 crc kubenswrapper[4837]: I0111 17:53:27.656347 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 11 17:53:27 crc kubenswrapper[4837]: I0111 17:53:27.656399 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 11 17:53:28 crc kubenswrapper[4837]: I0111 17:53:28.490455 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerStarted","Data":"375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44"} Jan 11 17:53:28 crc kubenswrapper[4837]: I0111 17:53:28.490803 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerStarted","Data":"e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a"} Jan 11 17:53:28 crc kubenswrapper[4837]: I0111 17:53:28.665956 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 11 17:53:28 crc kubenswrapper[4837]: I0111 17:53:28.668800 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 11 17:53:28 crc kubenswrapper[4837]: I0111 17:53:28.669054 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 11 17:53:31 crc kubenswrapper[4837]: I0111 17:53:31.560384 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerStarted","Data":"3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654"} Jan 11 17:53:31 crc kubenswrapper[4837]: I0111 17:53:31.561402 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 11 17:53:31 crc kubenswrapper[4837]: I0111 17:53:31.610776 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.124779955 podStartE2EDuration="6.610743677s" podCreationTimestamp="2026-01-11 17:53:25 +0000 UTC" firstStartedPulling="2026-01-11 17:53:26.273853864 +0000 UTC m=+1380.452046610" lastFinishedPulling="2026-01-11 17:53:30.759817586 +0000 UTC m=+1384.938010332" observedRunningTime="2026-01-11 17:53:31.605553478 +0000 UTC m=+1385.783746214" watchObservedRunningTime="2026-01-11 17:53:31.610743677 +0000 UTC m=+1385.788936423" Jan 11 17:53:31 crc kubenswrapper[4837]: I0111 17:53:31.769078 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 11 17:53:31 crc kubenswrapper[4837]: I0111 17:53:31.823303 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 11 17:53:32 crc kubenswrapper[4837]: I0111 17:53:32.600393 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 11 17:53:32 crc kubenswrapper[4837]: I0111 17:53:32.762359 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 11 17:53:32 crc kubenswrapper[4837]: I0111 17:53:32.762749 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 11 17:53:32 crc kubenswrapper[4837]: I0111 17:53:32.799885 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 11 17:53:33 crc kubenswrapper[4837]: I0111 17:53:33.845894 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-api" probeResult="failure" output="Get 
\"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 11 17:53:33 crc kubenswrapper[4837]: I0111 17:53:33.845989 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 11 17:53:37 crc kubenswrapper[4837]: I0111 17:53:37.664070 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 11 17:53:37 crc kubenswrapper[4837]: I0111 17:53:37.666317 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 11 17:53:37 crc kubenswrapper[4837]: I0111 17:53:37.672876 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 11 17:53:38 crc kubenswrapper[4837]: I0111 17:53:38.670240 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 11 17:53:39 crc kubenswrapper[4837]: I0111 17:53:39.444747 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:53:39 crc kubenswrapper[4837]: I0111 17:53:39.445091 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.468366 4837 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.511437 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-config-data\") pod \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.511530 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwrzf\" (UniqueName: \"kubernetes.io/projected/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-kube-api-access-lwrzf\") pod \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.511722 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-combined-ca-bundle\") pod \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\" (UID: \"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017\") " Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.516859 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-kube-api-access-lwrzf" (OuterVolumeSpecName: "kube-api-access-lwrzf") pod "4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" (UID: "4a7db2fa-3a60-4cf5-91c0-dcd6c2698017"). InnerVolumeSpecName "kube-api-access-lwrzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.539299 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" (UID: "4a7db2fa-3a60-4cf5-91c0-dcd6c2698017"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.547681 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-config-data" (OuterVolumeSpecName: "config-data") pod "4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" (UID: "4a7db2fa-3a60-4cf5-91c0-dcd6c2698017"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.614421 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwrzf\" (UniqueName: \"kubernetes.io/projected/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-kube-api-access-lwrzf\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.614452 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.614462 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.686830 4837 generic.go:334] "Generic (PLEG): container finished" podID="4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" containerID="f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118" exitCode=137 Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.687346 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.687212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017","Type":"ContainerDied","Data":"f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118"} Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.687433 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a7db2fa-3a60-4cf5-91c0-dcd6c2698017","Type":"ContainerDied","Data":"64c0feed2f7dcfe54d08da836ffa3b2a08d7f3e1439561924272aa7c15cd0b30"} Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.687504 4837 scope.go:117] "RemoveContainer" containerID="f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.730174 4837 scope.go:117] "RemoveContainer" containerID="f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118" Jan 11 17:53:40 crc kubenswrapper[4837]: E0111 17:53:40.732145 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118\": container with ID starting with f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118 not found: ID does not exist" containerID="f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.732212 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118"} err="failed to get container status \"f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118\": rpc error: code = NotFound desc = could not find container \"f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118\": container with ID starting with 
f92ef5ffcf365e1c7b136e774917941f7ce4f020a5c85792b9b22f5cddb3d118 not found: ID does not exist" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.744160 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.758807 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.783968 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:40 crc kubenswrapper[4837]: E0111 17:53:40.784539 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" containerName="nova-cell1-novncproxy-novncproxy" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.784567 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" containerName="nova-cell1-novncproxy-novncproxy" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.785083 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" containerName="nova-cell1-novncproxy-novncproxy" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.786093 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.788762 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.789166 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.789373 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.793857 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.818640 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4nl7\" (UniqueName: \"kubernetes.io/projected/d590d80f-b67b-4740-8433-bcab03dca733-kube-api-access-c4nl7\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.818840 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.818946 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc 
kubenswrapper[4837]: I0111 17:53:40.819013 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.819186 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.920774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.920858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4nl7\" (UniqueName: \"kubernetes.io/projected/d590d80f-b67b-4740-8433-bcab03dca733-kube-api-access-c4nl7\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.920913 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc 
kubenswrapper[4837]: I0111 17:53:40.920948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.920982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.924146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.926002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.926574 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.933064 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d590d80f-b67b-4740-8433-bcab03dca733-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:40 crc kubenswrapper[4837]: I0111 17:53:40.937063 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4nl7\" (UniqueName: \"kubernetes.io/projected/d590d80f-b67b-4740-8433-bcab03dca733-kube-api-access-c4nl7\") pod \"nova-cell1-novncproxy-0\" (UID: \"d590d80f-b67b-4740-8433-bcab03dca733\") " pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:41 crc kubenswrapper[4837]: I0111 17:53:41.117549 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:41 crc kubenswrapper[4837]: I0111 17:53:41.594885 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 11 17:53:41 crc kubenswrapper[4837]: I0111 17:53:41.700337 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d590d80f-b67b-4740-8433-bcab03dca733","Type":"ContainerStarted","Data":"e431c264321fac34489115ae7b379d3284ce4f5ebe52293a921f5fcea633dcc1"} Jan 11 17:53:42 crc kubenswrapper[4837]: I0111 17:53:42.375392 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7db2fa-3a60-4cf5-91c0-dcd6c2698017" path="/var/lib/kubelet/pods/4a7db2fa-3a60-4cf5-91c0-dcd6c2698017/volumes" Jan 11 17:53:42 crc kubenswrapper[4837]: I0111 17:53:42.712091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d590d80f-b67b-4740-8433-bcab03dca733","Type":"ContainerStarted","Data":"053e436d38354146a30f543acb8aea4af572f13a2e76cb24d285b893ac1742bc"} Jan 11 17:53:42 crc kubenswrapper[4837]: I0111 17:53:42.736049 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.736028204 podStartE2EDuration="2.736028204s" podCreationTimestamp="2026-01-11 17:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:42.731894214 +0000 UTC m=+1396.910086940" watchObservedRunningTime="2026-01-11 17:53:42.736028204 +0000 UTC m=+1396.914220920" Jan 11 17:53:42 crc kubenswrapper[4837]: I0111 17:53:42.766568 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 11 17:53:42 crc kubenswrapper[4837]: I0111 17:53:42.767166 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 11 17:53:42 crc kubenswrapper[4837]: I0111 17:53:42.767507 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 11 17:53:42 crc kubenswrapper[4837]: I0111 17:53:42.770716 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.724457 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.730045 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.944617 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-txfdc"] Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.946282 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.962791 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-txfdc"] Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.986984 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.987144 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.987170 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.987198 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.987349 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:43 crc kubenswrapper[4837]: I0111 17:53:43.987397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9jgw\" (UniqueName: \"kubernetes.io/projected/d146eef1-2f66-448c-a614-f5832ddbaaa6-kube-api-access-c9jgw\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.088775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.088823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9jgw\" (UniqueName: \"kubernetes.io/projected/d146eef1-2f66-448c-a614-f5832ddbaaa6-kube-api-access-c9jgw\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.088855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.088958 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.088982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.089007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.089825 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.089848 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.089839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.089976 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.089990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.123598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9jgw\" (UniqueName: \"kubernetes.io/projected/d146eef1-2f66-448c-a614-f5832ddbaaa6-kube-api-access-c9jgw\") pod \"dnsmasq-dns-59cf4bdb65-txfdc\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.279823 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:44 crc kubenswrapper[4837]: W0111 17:53:44.773617 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd146eef1_2f66_448c_a614_f5832ddbaaa6.slice/crio-d71e930dfad19d22283ebcbe145ebb94f2aec3e69fd6ff607d45e5ea71aaee3c WatchSource:0}: Error finding container d71e930dfad19d22283ebcbe145ebb94f2aec3e69fd6ff607d45e5ea71aaee3c: Status 404 returned error can't find the container with id d71e930dfad19d22283ebcbe145ebb94f2aec3e69fd6ff607d45e5ea71aaee3c Jan 11 17:53:44 crc kubenswrapper[4837]: I0111 17:53:44.773666 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-txfdc"] Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.740591 4837 generic.go:334] "Generic (PLEG): container finished" podID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerID="67b619fb41b4abee9735c879b3c66957d0f9204509bddc0c9821b76af68c53a5" exitCode=0 Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.740709 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" event={"ID":"d146eef1-2f66-448c-a614-f5832ddbaaa6","Type":"ContainerDied","Data":"67b619fb41b4abee9735c879b3c66957d0f9204509bddc0c9821b76af68c53a5"} Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.740959 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" event={"ID":"d146eef1-2f66-448c-a614-f5832ddbaaa6","Type":"ContainerStarted","Data":"d71e930dfad19d22283ebcbe145ebb94f2aec3e69fd6ff607d45e5ea71aaee3c"} Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.928505 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.928876 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-central-agent" containerID="cri-o://8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b" gracePeriod=30 Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.928983 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-notification-agent" containerID="cri-o://e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a" gracePeriod=30 Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.928983 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="proxy-httpd" containerID="cri-o://3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654" gracePeriod=30 Jan 11 17:53:45 crc kubenswrapper[4837]: I0111 17:53:45.932696 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="sg-core" containerID="cri-o://375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44" gracePeriod=30 Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.029557 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.200:3000/\": read tcp 10.217.0.2:46302->10.217.0.200:3000: read: connection reset by peer" Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.118071 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.248482 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.751139 4837 
generic.go:334] "Generic (PLEG): container finished" podID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerID="3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654" exitCode=0 Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.751171 4837 generic.go:334] "Generic (PLEG): container finished" podID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerID="375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44" exitCode=2 Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.751182 4837 generic.go:334] "Generic (PLEG): container finished" podID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerID="8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b" exitCode=0 Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.751224 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerDied","Data":"3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654"} Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.751248 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerDied","Data":"375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44"} Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.751259 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerDied","Data":"8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b"} Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.753740 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-log" containerID="cri-o://7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c" gracePeriod=30 Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.754883 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" event={"ID":"d146eef1-2f66-448c-a614-f5832ddbaaa6","Type":"ContainerStarted","Data":"329a0e61e9944552996287a96c2c6a490d980fe654ce2e1cb9e5ead38b3a34b8"} Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.754912 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.755209 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-api" containerID="cri-o://8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2" gracePeriod=30 Jan 11 17:53:46 crc kubenswrapper[4837]: I0111 17:53:46.796038 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" podStartSLOduration=3.796020484 podStartE2EDuration="3.796020484s" podCreationTimestamp="2026-01-11 17:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:46.787942957 +0000 UTC m=+1400.966135673" watchObservedRunningTime="2026-01-11 17:53:46.796020484 +0000 UTC m=+1400.974213190" Jan 11 17:53:47 crc kubenswrapper[4837]: I0111 17:53:47.762597 4837 generic.go:334] "Generic (PLEG): container finished" podID="828211f0-0290-4c77-9227-355162626da7" containerID="7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c" exitCode=143 Jan 11 17:53:47 crc kubenswrapper[4837]: I0111 17:53:47.762690 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"828211f0-0290-4c77-9227-355162626da7","Type":"ContainerDied","Data":"7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c"} Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.525391 4837 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668022 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-scripts\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668137 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkcfp\" (UniqueName: \"kubernetes.io/projected/ee7eb3c4-5f92-4488-9853-f7e4227e176f-kube-api-access-pkcfp\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668169 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-combined-ca-bundle\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668223 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-log-httpd\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668247 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-ceilometer-tls-certs\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668284 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-sg-core-conf-yaml\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668303 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-config-data\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668325 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-run-httpd\") pod \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\" (UID: \"ee7eb3c4-5f92-4488-9853-f7e4227e176f\") " Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.668995 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.669043 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.674114 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7eb3c4-5f92-4488-9853-f7e4227e176f-kube-api-access-pkcfp" (OuterVolumeSpecName: "kube-api-access-pkcfp") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "kube-api-access-pkcfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.689882 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-scripts" (OuterVolumeSpecName: "scripts") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.694811 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.750560 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.770307 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkcfp\" (UniqueName: \"kubernetes.io/projected/ee7eb3c4-5f92-4488-9853-f7e4227e176f-kube-api-access-pkcfp\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.770333 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.770343 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.770379 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.770388 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee7eb3c4-5f92-4488-9853-f7e4227e176f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.770398 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.775953 4837 generic.go:334] "Generic (PLEG): container finished" podID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerID="e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a" exitCode=0 Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.775993 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerDied","Data":"e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a"} Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.776015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee7eb3c4-5f92-4488-9853-f7e4227e176f","Type":"ContainerDied","Data":"eafa9baffe0c05904da0d5d63ccb4b3e251a82add0f708738c1d9fdc6de9f91d"} Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.776031 4837 scope.go:117] "RemoveContainer" containerID="3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.776134 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.784633 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-config-data" (OuterVolumeSpecName: "config-data") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.785245 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee7eb3c4-5f92-4488-9853-f7e4227e176f" (UID: "ee7eb3c4-5f92-4488-9853-f7e4227e176f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.820060 4837 scope.go:117] "RemoveContainer" containerID="375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.844736 4837 scope.go:117] "RemoveContainer" containerID="e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.862248 4837 scope.go:117] "RemoveContainer" containerID="8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.871504 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.871530 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7eb3c4-5f92-4488-9853-f7e4227e176f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.879368 4837 scope.go:117] "RemoveContainer" containerID="3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654" Jan 11 17:53:48 crc kubenswrapper[4837]: E0111 17:53:48.879666 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654\": container with ID starting with 3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654 not found: ID does not exist" containerID="3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.879705 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654"} 
err="failed to get container status \"3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654\": rpc error: code = NotFound desc = could not find container \"3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654\": container with ID starting with 3db2a6f8fd5510bc8f798967034d30ae868105df5ca7251ca0c8401aeac78654 not found: ID does not exist" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.879724 4837 scope.go:117] "RemoveContainer" containerID="375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44" Jan 11 17:53:48 crc kubenswrapper[4837]: E0111 17:53:48.880571 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44\": container with ID starting with 375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44 not found: ID does not exist" containerID="375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.880596 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44"} err="failed to get container status \"375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44\": rpc error: code = NotFound desc = could not find container \"375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44\": container with ID starting with 375272b3d5168c8e18fb629aab8185dc34e6d92e52d87041946a2e8051463e44 not found: ID does not exist" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.880608 4837 scope.go:117] "RemoveContainer" containerID="e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a" Jan 11 17:53:48 crc kubenswrapper[4837]: E0111 17:53:48.880949 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a\": container with ID starting with e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a not found: ID does not exist" containerID="e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.881003 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a"} err="failed to get container status \"e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a\": rpc error: code = NotFound desc = could not find container \"e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a\": container with ID starting with e6553301b2ebac718e0397f4f6aaf2943ed02f23e5cfd11903fa2c3f0799b48a not found: ID does not exist" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.881040 4837 scope.go:117] "RemoveContainer" containerID="8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b" Jan 11 17:53:48 crc kubenswrapper[4837]: E0111 17:53:48.881355 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b\": container with ID starting with 8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b not found: ID does not exist" containerID="8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b" Jan 11 17:53:48 crc kubenswrapper[4837]: I0111 17:53:48.881381 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b"} err="failed to get container status \"8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b\": rpc error: code = NotFound desc = could not find container \"8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b\": container with ID 
starting with 8e7fc61e189dc675f2de70ebbbb288b4dc6c38fc8eafc7533897329ce4cc860b not found: ID does not exist" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.118928 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.133151 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.143406 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:49 crc kubenswrapper[4837]: E0111 17:53:49.145534 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-notification-agent" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.145564 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-notification-agent" Jan 11 17:53:49 crc kubenswrapper[4837]: E0111 17:53:49.145580 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-central-agent" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.145587 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-central-agent" Jan 11 17:53:49 crc kubenswrapper[4837]: E0111 17:53:49.145606 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="sg-core" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.145612 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="sg-core" Jan 11 17:53:49 crc kubenswrapper[4837]: E0111 17:53:49.145632 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="proxy-httpd" Jan 11 17:53:49 crc 
kubenswrapper[4837]: I0111 17:53:49.145638 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="proxy-httpd" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.145926 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-central-agent" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.145953 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="ceilometer-notification-agent" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.145972 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="proxy-httpd" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.146002 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" containerName="sg-core" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.148092 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.152647 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.153649 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.153818 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.158139 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19839462-912e-421f-8d6d-a5ef5d8129f5-run-httpd\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178625 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-scripts\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178661 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4qkf\" 
(UniqueName: \"kubernetes.io/projected/19839462-912e-421f-8d6d-a5ef5d8129f5-kube-api-access-g4qkf\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178730 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178927 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178961 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-config-data\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.178979 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19839462-912e-421f-8d6d-a5ef5d8129f5-log-httpd\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4qkf\" (UniqueName: \"kubernetes.io/projected/19839462-912e-421f-8d6d-a5ef5d8129f5-kube-api-access-g4qkf\") pod \"ceilometer-0\" (UID: 
\"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280545 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280767 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-config-data\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280800 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19839462-912e-421f-8d6d-a5ef5d8129f5-log-httpd\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19839462-912e-421f-8d6d-a5ef5d8129f5-run-httpd\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.280949 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-scripts\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.282330 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19839462-912e-421f-8d6d-a5ef5d8129f5-run-httpd\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.282717 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19839462-912e-421f-8d6d-a5ef5d8129f5-log-httpd\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.291391 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.291579 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.292819 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.298015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-config-data\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.302632 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4qkf\" (UniqueName: \"kubernetes.io/projected/19839462-912e-421f-8d6d-a5ef5d8129f5-kube-api-access-g4qkf\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.303268 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19839462-912e-421f-8d6d-a5ef5d8129f5-scripts\") pod \"ceilometer-0\" (UID: \"19839462-912e-421f-8d6d-a5ef5d8129f5\") " pod="openstack/ceilometer-0" Jan 11 17:53:49 crc kubenswrapper[4837]: I0111 17:53:49.531220 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.067900 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.371872 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.382968 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee7eb3c4-5f92-4488-9853-f7e4227e176f" path="/var/lib/kubelet/pods/ee7eb3c4-5f92-4488-9853-f7e4227e176f/volumes" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.507093 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-combined-ca-bundle\") pod \"828211f0-0290-4c77-9227-355162626da7\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.507476 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828211f0-0290-4c77-9227-355162626da7-logs\") pod \"828211f0-0290-4c77-9227-355162626da7\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.507859 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7lkj\" (UniqueName: \"kubernetes.io/projected/828211f0-0290-4c77-9227-355162626da7-kube-api-access-d7lkj\") pod \"828211f0-0290-4c77-9227-355162626da7\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.507987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-config-data\") pod \"828211f0-0290-4c77-9227-355162626da7\" (UID: \"828211f0-0290-4c77-9227-355162626da7\") " Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.508172 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/828211f0-0290-4c77-9227-355162626da7-logs" (OuterVolumeSpecName: "logs") pod "828211f0-0290-4c77-9227-355162626da7" (UID: 
"828211f0-0290-4c77-9227-355162626da7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.508651 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/828211f0-0290-4c77-9227-355162626da7-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.513886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828211f0-0290-4c77-9227-355162626da7-kube-api-access-d7lkj" (OuterVolumeSpecName: "kube-api-access-d7lkj") pod "828211f0-0290-4c77-9227-355162626da7" (UID: "828211f0-0290-4c77-9227-355162626da7"). InnerVolumeSpecName "kube-api-access-d7lkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.542921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-config-data" (OuterVolumeSpecName: "config-data") pod "828211f0-0290-4c77-9227-355162626da7" (UID: "828211f0-0290-4c77-9227-355162626da7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.549051 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "828211f0-0290-4c77-9227-355162626da7" (UID: "828211f0-0290-4c77-9227-355162626da7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.610475 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7lkj\" (UniqueName: \"kubernetes.io/projected/828211f0-0290-4c77-9227-355162626da7-kube-api-access-d7lkj\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.610520 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.610531 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828211f0-0290-4c77-9227-355162626da7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.809643 4837 generic.go:334] "Generic (PLEG): container finished" podID="828211f0-0290-4c77-9227-355162626da7" containerID="8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2" exitCode=0 Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.809780 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.809781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"828211f0-0290-4c77-9227-355162626da7","Type":"ContainerDied","Data":"8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2"} Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.809815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"828211f0-0290-4c77-9227-355162626da7","Type":"ContainerDied","Data":"63a4382173bdba943b7dac60ba6885a769c654e8fc4ffe0302c3b49d62ac774f"} Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.809834 4837 scope.go:117] "RemoveContainer" containerID="8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.819614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19839462-912e-421f-8d6d-a5ef5d8129f5","Type":"ContainerStarted","Data":"0fdaf886636b46198a33ce997f5f592b9198e0f1ac87d4ed69daa77ec90651ca"} Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.819648 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19839462-912e-421f-8d6d-a5ef5d8129f5","Type":"ContainerStarted","Data":"ff4bd60c8c8f283b0abbfee9f75d1a7b612365f3008f498053cb6d462dc921e8"} Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.848075 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.855018 4837 scope.go:117] "RemoveContainer" containerID="7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.855892 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.874444 4837 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:50 crc kubenswrapper[4837]: E0111 17:53:50.874897 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-log" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.874916 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-log" Jan 11 17:53:50 crc kubenswrapper[4837]: E0111 17:53:50.874931 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-api" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.874939 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-api" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.875221 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-log" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.875243 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="828211f0-0290-4c77-9227-355162626da7" containerName="nova-api-api" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.876514 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.880379 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.880749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.880916 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.884587 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.902858 4837 scope.go:117] "RemoveContainer" containerID="8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2" Jan 11 17:53:50 crc kubenswrapper[4837]: E0111 17:53:50.903905 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2\": container with ID starting with 8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2 not found: ID does not exist" containerID="8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.903949 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2"} err="failed to get container status \"8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2\": rpc error: code = NotFound desc = could not find container \"8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2\": container with ID starting with 8cc66cf28427707b0c7b04e92d8fbf6d384b795df55598e6dc8b25c14da582a2 not found: ID does not exist" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.903975 4837 
scope.go:117] "RemoveContainer" containerID="7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c" Jan 11 17:53:50 crc kubenswrapper[4837]: E0111 17:53:50.905242 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c\": container with ID starting with 7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c not found: ID does not exist" containerID="7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c" Jan 11 17:53:50 crc kubenswrapper[4837]: I0111 17:53:50.905270 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c"} err="failed to get container status \"7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c\": rpc error: code = NotFound desc = could not find container \"7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c\": container with ID starting with 7c5b8b71b307d694cce22170c6a15a275d77a4de9ceed97f2be6074fcc3e266c not found: ID does not exist" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.015615 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx7v2\" (UniqueName: \"kubernetes.io/projected/c636cfd6-27dc-4171-942e-8bb2cb7046a1-kube-api-access-sx7v2\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.015779 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c636cfd6-27dc-4171-942e-8bb2cb7046a1-logs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.015830 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-public-tls-certs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.015851 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.015886 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.015940 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-config-data\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.117682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-public-tls-certs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.118450 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.118513 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.118603 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.118733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-config-data\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.118836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx7v2\" (UniqueName: \"kubernetes.io/projected/c636cfd6-27dc-4171-942e-8bb2cb7046a1-kube-api-access-sx7v2\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.118973 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c636cfd6-27dc-4171-942e-8bb2cb7046a1-logs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.119341 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c636cfd6-27dc-4171-942e-8bb2cb7046a1-logs\") pod 
\"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.123233 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.123546 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-public-tls-certs\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.124644 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-config-data\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.130018 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.137583 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx7v2\" (UniqueName: \"kubernetes.io/projected/c636cfd6-27dc-4171-942e-8bb2cb7046a1-kube-api-access-sx7v2\") pod \"nova-api-0\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.138565 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.198102 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.713290 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:53:51 crc kubenswrapper[4837]: W0111 17:53:51.721906 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc636cfd6_27dc_4171_942e_8bb2cb7046a1.slice/crio-748c577171a0956f51ab747f28a2dfa160cb78179cb4ffc463b4cd50401d2eca WatchSource:0}: Error finding container 748c577171a0956f51ab747f28a2dfa160cb78179cb4ffc463b4cd50401d2eca: Status 404 returned error can't find the container with id 748c577171a0956f51ab747f28a2dfa160cb78179cb4ffc463b4cd50401d2eca Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.831302 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19839462-912e-421f-8d6d-a5ef5d8129f5","Type":"ContainerStarted","Data":"7babf98d1e51fcdac0de875f4f59afc2022b331fcc13dfbfc2ed6176dbc817b0"} Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.832861 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c636cfd6-27dc-4171-942e-8bb2cb7046a1","Type":"ContainerStarted","Data":"748c577171a0956f51ab747f28a2dfa160cb78179cb4ffc463b4cd50401d2eca"} Jan 11 17:53:51 crc kubenswrapper[4837]: I0111 17:53:51.856338 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.022501 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mdgzf"] Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.023799 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.028026 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.028553 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.033790 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mdgzf"] Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.137110 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.137177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-scripts\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.137204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5l9q\" (UniqueName: \"kubernetes.io/projected/293316f1-dfe8-42c8-82d3-365be90cdbd1-kube-api-access-r5l9q\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.137706 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-config-data\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.240130 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.240250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-scripts\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.240287 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5l9q\" (UniqueName: \"kubernetes.io/projected/293316f1-dfe8-42c8-82d3-365be90cdbd1-kube-api-access-r5l9q\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.240447 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-config-data\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.245325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-scripts\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.245743 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.253967 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-config-data\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.258097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5l9q\" (UniqueName: \"kubernetes.io/projected/293316f1-dfe8-42c8-82d3-365be90cdbd1-kube-api-access-r5l9q\") pod \"nova-cell1-cell-mapping-mdgzf\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.349999 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.373666 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="828211f0-0290-4c77-9227-355162626da7" path="/var/lib/kubelet/pods/828211f0-0290-4c77-9227-355162626da7/volumes" Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.828067 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mdgzf"] Jan 11 17:53:52 crc kubenswrapper[4837]: W0111 17:53:52.828981 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod293316f1_dfe8_42c8_82d3_365be90cdbd1.slice/crio-df8fa80274ca24b2230bbbe9ff9caf29d79c60a40fdb2b0e8d722f229336506e WatchSource:0}: Error finding container df8fa80274ca24b2230bbbe9ff9caf29d79c60a40fdb2b0e8d722f229336506e: Status 404 returned error can't find the container with id df8fa80274ca24b2230bbbe9ff9caf29d79c60a40fdb2b0e8d722f229336506e Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.843785 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mdgzf" event={"ID":"293316f1-dfe8-42c8-82d3-365be90cdbd1","Type":"ContainerStarted","Data":"df8fa80274ca24b2230bbbe9ff9caf29d79c60a40fdb2b0e8d722f229336506e"} Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.846110 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19839462-912e-421f-8d6d-a5ef5d8129f5","Type":"ContainerStarted","Data":"9dbb67cfb66353a1373870732ba75a7b63226d1e6dcf091e60e5cf54d603edb9"} Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.848671 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c636cfd6-27dc-4171-942e-8bb2cb7046a1","Type":"ContainerStarted","Data":"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2"} Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 
17:53:52.848727 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c636cfd6-27dc-4171-942e-8bb2cb7046a1","Type":"ContainerStarted","Data":"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c"} Jan 11 17:53:52 crc kubenswrapper[4837]: I0111 17:53:52.874435 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.874373299 podStartE2EDuration="2.874373299s" podCreationTimestamp="2026-01-11 17:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:52.866453567 +0000 UTC m=+1407.044646293" watchObservedRunningTime="2026-01-11 17:53:52.874373299 +0000 UTC m=+1407.052566005" Jan 11 17:53:53 crc kubenswrapper[4837]: I0111 17:53:53.859183 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mdgzf" event={"ID":"293316f1-dfe8-42c8-82d3-365be90cdbd1","Type":"ContainerStarted","Data":"1fbb68e7f3679aad99ef6abda0b2c572c0228d6ef797d566597d8a067bfc7bbb"} Jan 11 17:53:53 crc kubenswrapper[4837]: I0111 17:53:53.892263 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mdgzf" podStartSLOduration=1.892235112 podStartE2EDuration="1.892235112s" podCreationTimestamp="2026-01-11 17:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:53:53.871157865 +0000 UTC m=+1408.049350581" watchObservedRunningTime="2026-01-11 17:53:53.892235112 +0000 UTC m=+1408.070427828" Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.285898 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.419623 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-845d6d6f59-8q68w"] Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.419901 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" podUID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerName="dnsmasq-dns" containerID="cri-o://20a389c545715b8834414800356628d63aacbef1a7d9800a452baf08c3f21efe" gracePeriod=10 Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.869921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19839462-912e-421f-8d6d-a5ef5d8129f5","Type":"ContainerStarted","Data":"f49edcf2e1013d5f68c040fed240f43b531c047095161fd288cefde31cfd7515"} Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.870454 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.871861 4837 generic.go:334] "Generic (PLEG): container finished" podID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerID="20a389c545715b8834414800356628d63aacbef1a7d9800a452baf08c3f21efe" exitCode=0 Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.871892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" event={"ID":"a01c38b2-b0ff-400f-a6af-3be08fad9373","Type":"ContainerDied","Data":"20a389c545715b8834414800356628d63aacbef1a7d9800a452baf08c3f21efe"} Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.871944 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" event={"ID":"a01c38b2-b0ff-400f-a6af-3be08fad9373","Type":"ContainerDied","Data":"0e581973aa6483229ef8f725f63551923adbb0ec9eb872dfeb60cec9534fb4ac"} Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.871960 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e581973aa6483229ef8f725f63551923adbb0ec9eb872dfeb60cec9534fb4ac" Jan 11 17:53:54 crc 
kubenswrapper[4837]: I0111 17:53:54.892937 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:54 crc kubenswrapper[4837]: I0111 17:53:54.930218 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.282267124 podStartE2EDuration="5.930198232s" podCreationTimestamp="2026-01-11 17:53:49 +0000 UTC" firstStartedPulling="2026-01-11 17:53:50.079578322 +0000 UTC m=+1404.257771028" lastFinishedPulling="2026-01-11 17:53:53.7275094 +0000 UTC m=+1407.905702136" observedRunningTime="2026-01-11 17:53:54.906415034 +0000 UTC m=+1409.084607760" watchObservedRunningTime="2026-01-11 17:53:54.930198232 +0000 UTC m=+1409.108390938" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.014420 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-nb\") pod \"a01c38b2-b0ff-400f-a6af-3be08fad9373\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.014533 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-sb\") pod \"a01c38b2-b0ff-400f-a6af-3be08fad9373\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.014609 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-config\") pod \"a01c38b2-b0ff-400f-a6af-3be08fad9373\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.014642 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-svc\") pod \"a01c38b2-b0ff-400f-a6af-3be08fad9373\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.014780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt2xh\" (UniqueName: \"kubernetes.io/projected/a01c38b2-b0ff-400f-a6af-3be08fad9373-kube-api-access-qt2xh\") pod \"a01c38b2-b0ff-400f-a6af-3be08fad9373\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.014891 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-swift-storage-0\") pod \"a01c38b2-b0ff-400f-a6af-3be08fad9373\" (UID: \"a01c38b2-b0ff-400f-a6af-3be08fad9373\") " Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.032899 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01c38b2-b0ff-400f-a6af-3be08fad9373-kube-api-access-qt2xh" (OuterVolumeSpecName: "kube-api-access-qt2xh") pod "a01c38b2-b0ff-400f-a6af-3be08fad9373" (UID: "a01c38b2-b0ff-400f-a6af-3be08fad9373"). InnerVolumeSpecName "kube-api-access-qt2xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.130185 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt2xh\" (UniqueName: \"kubernetes.io/projected/a01c38b2-b0ff-400f-a6af-3be08fad9373-kube-api-access-qt2xh\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.140857 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-config" (OuterVolumeSpecName: "config") pod "a01c38b2-b0ff-400f-a6af-3be08fad9373" (UID: "a01c38b2-b0ff-400f-a6af-3be08fad9373"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.152354 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a01c38b2-b0ff-400f-a6af-3be08fad9373" (UID: "a01c38b2-b0ff-400f-a6af-3be08fad9373"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.153723 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a01c38b2-b0ff-400f-a6af-3be08fad9373" (UID: "a01c38b2-b0ff-400f-a6af-3be08fad9373"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.182730 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t482m"] Jan 11 17:53:55 crc kubenswrapper[4837]: E0111 17:53:55.183157 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerName="init" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.183174 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerName="init" Jan 11 17:53:55 crc kubenswrapper[4837]: E0111 17:53:55.183195 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerName="dnsmasq-dns" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.183202 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerName="dnsmasq-dns" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.183401 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a01c38b2-b0ff-400f-a6af-3be08fad9373" containerName="dnsmasq-dns" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.184667 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.185700 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a01c38b2-b0ff-400f-a6af-3be08fad9373" (UID: "a01c38b2-b0ff-400f-a6af-3be08fad9373"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.187961 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a01c38b2-b0ff-400f-a6af-3be08fad9373" (UID: "a01c38b2-b0ff-400f-a6af-3be08fad9373"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.193773 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t482m"] Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.232134 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-utilities\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.232177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/9c525acb-2a0a-4df5-866a-507d5ba03364-kube-api-access-v4srt\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.232216 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-catalog-content\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.232308 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.232320 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:55 crc 
kubenswrapper[4837]: I0111 17:53:55.232329 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.232339 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.232347 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01c38b2-b0ff-400f-a6af-3be08fad9373-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.333845 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-utilities\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.333892 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/9c525acb-2a0a-4df5-866a-507d5ba03364-kube-api-access-v4srt\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.333933 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-catalog-content\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc 
kubenswrapper[4837]: I0111 17:53:55.334356 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-utilities\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.334424 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-catalog-content\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.349575 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/9c525acb-2a0a-4df5-866a-507d5ba03364-kube-api-access-v4srt\") pod \"redhat-operators-t482m\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.511623 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.879695 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-8q68w" Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.917005 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-8q68w"] Jan 11 17:53:55 crc kubenswrapper[4837]: I0111 17:53:55.935484 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-8q68w"] Jan 11 17:53:56 crc kubenswrapper[4837]: I0111 17:53:56.027170 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t482m"] Jan 11 17:53:56 crc kubenswrapper[4837]: W0111 17:53:56.031915 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c525acb_2a0a_4df5_866a_507d5ba03364.slice/crio-ba499b510764f17e49dea8ad36fd1218fa1738031338f07f882ca12ec309b9ad WatchSource:0}: Error finding container ba499b510764f17e49dea8ad36fd1218fa1738031338f07f882ca12ec309b9ad: Status 404 returned error can't find the container with id ba499b510764f17e49dea8ad36fd1218fa1738031338f07f882ca12ec309b9ad Jan 11 17:53:56 crc kubenswrapper[4837]: I0111 17:53:56.373573 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01c38b2-b0ff-400f-a6af-3be08fad9373" path="/var/lib/kubelet/pods/a01c38b2-b0ff-400f-a6af-3be08fad9373/volumes" Jan 11 17:53:56 crc kubenswrapper[4837]: I0111 17:53:56.887653 4837 generic.go:334] "Generic (PLEG): container finished" podID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerID="8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b" exitCode=0 Jan 11 17:53:56 crc kubenswrapper[4837]: I0111 17:53:56.887831 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t482m" event={"ID":"9c525acb-2a0a-4df5-866a-507d5ba03364","Type":"ContainerDied","Data":"8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b"} Jan 11 17:53:56 crc kubenswrapper[4837]: I0111 17:53:56.887922 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t482m" event={"ID":"9c525acb-2a0a-4df5-866a-507d5ba03364","Type":"ContainerStarted","Data":"ba499b510764f17e49dea8ad36fd1218fa1738031338f07f882ca12ec309b9ad"} Jan 11 17:53:57 crc kubenswrapper[4837]: I0111 17:53:57.898538 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t482m" event={"ID":"9c525acb-2a0a-4df5-866a-507d5ba03364","Type":"ContainerStarted","Data":"2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb"} Jan 11 17:53:58 crc kubenswrapper[4837]: I0111 17:53:58.908453 4837 generic.go:334] "Generic (PLEG): container finished" podID="293316f1-dfe8-42c8-82d3-365be90cdbd1" containerID="1fbb68e7f3679aad99ef6abda0b2c572c0228d6ef797d566597d8a067bfc7bbb" exitCode=0 Jan 11 17:53:58 crc kubenswrapper[4837]: I0111 17:53:58.908531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mdgzf" event={"ID":"293316f1-dfe8-42c8-82d3-365be90cdbd1","Type":"ContainerDied","Data":"1fbb68e7f3679aad99ef6abda0b2c572c0228d6ef797d566597d8a067bfc7bbb"} Jan 11 17:53:59 crc kubenswrapper[4837]: E0111 17:53:59.490209 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c525acb_2a0a_4df5_866a_507d5ba03364.slice/crio-conmon-2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb.scope\": RecentStats: unable to find data in memory cache]" Jan 11 17:53:59 crc kubenswrapper[4837]: I0111 17:53:59.923864 4837 generic.go:334] "Generic (PLEG): container finished" podID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerID="2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb" exitCode=0 Jan 11 17:53:59 crc kubenswrapper[4837]: I0111 17:53:59.923951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t482m" 
event={"ID":"9c525acb-2a0a-4df5-866a-507d5ba03364","Type":"ContainerDied","Data":"2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb"} Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.397690 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.458184 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-config-data\") pod \"293316f1-dfe8-42c8-82d3-365be90cdbd1\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.458288 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-scripts\") pod \"293316f1-dfe8-42c8-82d3-365be90cdbd1\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.458440 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-combined-ca-bundle\") pod \"293316f1-dfe8-42c8-82d3-365be90cdbd1\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.469695 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-scripts" (OuterVolumeSpecName: "scripts") pod "293316f1-dfe8-42c8-82d3-365be90cdbd1" (UID: "293316f1-dfe8-42c8-82d3-365be90cdbd1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.499246 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-config-data" (OuterVolumeSpecName: "config-data") pod "293316f1-dfe8-42c8-82d3-365be90cdbd1" (UID: "293316f1-dfe8-42c8-82d3-365be90cdbd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.510587 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "293316f1-dfe8-42c8-82d3-365be90cdbd1" (UID: "293316f1-dfe8-42c8-82d3-365be90cdbd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.559897 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5l9q\" (UniqueName: \"kubernetes.io/projected/293316f1-dfe8-42c8-82d3-365be90cdbd1-kube-api-access-r5l9q\") pod \"293316f1-dfe8-42c8-82d3-365be90cdbd1\" (UID: \"293316f1-dfe8-42c8-82d3-365be90cdbd1\") " Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.561564 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.561968 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.562026 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/293316f1-dfe8-42c8-82d3-365be90cdbd1-scripts\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.586903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293316f1-dfe8-42c8-82d3-365be90cdbd1-kube-api-access-r5l9q" (OuterVolumeSpecName: "kube-api-access-r5l9q") pod "293316f1-dfe8-42c8-82d3-365be90cdbd1" (UID: "293316f1-dfe8-42c8-82d3-365be90cdbd1"). InnerVolumeSpecName "kube-api-access-r5l9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.663606 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5l9q\" (UniqueName: \"kubernetes.io/projected/293316f1-dfe8-42c8-82d3-365be90cdbd1-kube-api-access-r5l9q\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.942169 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mdgzf" event={"ID":"293316f1-dfe8-42c8-82d3-365be90cdbd1","Type":"ContainerDied","Data":"df8fa80274ca24b2230bbbe9ff9caf29d79c60a40fdb2b0e8d722f229336506e"} Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.942880 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8fa80274ca24b2230bbbe9ff9caf29d79c60a40fdb2b0e8d722f229336506e" Jan 11 17:54:00 crc kubenswrapper[4837]: I0111 17:54:00.942260 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mdgzf" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.131770 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.132068 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" containerName="nova-scheduler-scheduler" containerID="cri-o://05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" gracePeriod=30 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.147002 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.147284 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-log" containerID="cri-o://9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c" gracePeriod=30 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.147372 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-api" containerID="cri-o://39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2" gracePeriod=30 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.212830 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.213062 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-log" containerID="cri-o://bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3" gracePeriod=30 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.213342 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-metadata" containerID="cri-o://b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477" gracePeriod=30 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.737735 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:54:01 crc kubenswrapper[4837]: E0111 17:54:01.779039 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:54:01 crc kubenswrapper[4837]: E0111 17:54:01.783809 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:54:01 crc kubenswrapper[4837]: E0111 17:54:01.793788 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:54:01 crc kubenswrapper[4837]: E0111 17:54:01.793854 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" 
containerName="nova-scheduler-scheduler" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.795888 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-config-data\") pod \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.795944 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-internal-tls-certs\") pod \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.795968 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-public-tls-certs\") pod \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.796039 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c636cfd6-27dc-4171-942e-8bb2cb7046a1-logs\") pod \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.796081 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-combined-ca-bundle\") pod \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.796227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx7v2\" (UniqueName: 
\"kubernetes.io/projected/c636cfd6-27dc-4171-942e-8bb2cb7046a1-kube-api-access-sx7v2\") pod \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\" (UID: \"c636cfd6-27dc-4171-942e-8bb2cb7046a1\") " Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.797633 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c636cfd6-27dc-4171-942e-8bb2cb7046a1-logs" (OuterVolumeSpecName: "logs") pod "c636cfd6-27dc-4171-942e-8bb2cb7046a1" (UID: "c636cfd6-27dc-4171-942e-8bb2cb7046a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.824208 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c636cfd6-27dc-4171-942e-8bb2cb7046a1-kube-api-access-sx7v2" (OuterVolumeSpecName: "kube-api-access-sx7v2") pod "c636cfd6-27dc-4171-942e-8bb2cb7046a1" (UID: "c636cfd6-27dc-4171-942e-8bb2cb7046a1"). InnerVolumeSpecName "kube-api-access-sx7v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.875164 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c636cfd6-27dc-4171-942e-8bb2cb7046a1" (UID: "c636cfd6-27dc-4171-942e-8bb2cb7046a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.879696 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c636cfd6-27dc-4171-942e-8bb2cb7046a1" (UID: "c636cfd6-27dc-4171-942e-8bb2cb7046a1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.891162 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-config-data" (OuterVolumeSpecName: "config-data") pod "c636cfd6-27dc-4171-942e-8bb2cb7046a1" (UID: "c636cfd6-27dc-4171-942e-8bb2cb7046a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.897824 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx7v2\" (UniqueName: \"kubernetes.io/projected/c636cfd6-27dc-4171-942e-8bb2cb7046a1-kube-api-access-sx7v2\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.897850 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.897859 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.897870 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c636cfd6-27dc-4171-942e-8bb2cb7046a1-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.897878 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.911318 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c636cfd6-27dc-4171-942e-8bb2cb7046a1" (UID: "c636cfd6-27dc-4171-942e-8bb2cb7046a1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.957152 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t482m" event={"ID":"9c525acb-2a0a-4df5-866a-507d5ba03364","Type":"ContainerStarted","Data":"0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904"} Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.967753 4837 generic.go:334] "Generic (PLEG): container finished" podID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerID="39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2" exitCode=0 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.967781 4837 generic.go:334] "Generic (PLEG): container finished" podID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerID="9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c" exitCode=143 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.967839 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.967845 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c636cfd6-27dc-4171-942e-8bb2cb7046a1","Type":"ContainerDied","Data":"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2"} Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.967920 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c636cfd6-27dc-4171-942e-8bb2cb7046a1","Type":"ContainerDied","Data":"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c"} Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.967936 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c636cfd6-27dc-4171-942e-8bb2cb7046a1","Type":"ContainerDied","Data":"748c577171a0956f51ab747f28a2dfa160cb78179cb4ffc463b4cd50401d2eca"} Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.968037 4837 scope.go:117] "RemoveContainer" containerID="39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.972327 4837 generic.go:334] "Generic (PLEG): container finished" podID="67f85854-48e7-4151-9051-da416208058a" containerID="bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3" exitCode=143 Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.972351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67f85854-48e7-4151-9051-da416208058a","Type":"ContainerDied","Data":"bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3"} Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.980546 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t482m" podStartSLOduration=3.026855694 podStartE2EDuration="6.980532349s" podCreationTimestamp="2026-01-11 17:53:55 +0000 UTC" 
firstStartedPulling="2026-01-11 17:53:56.890044559 +0000 UTC m=+1411.068237265" lastFinishedPulling="2026-01-11 17:54:00.843721214 +0000 UTC m=+1415.021913920" observedRunningTime="2026-01-11 17:54:01.975560175 +0000 UTC m=+1416.153752901" watchObservedRunningTime="2026-01-11 17:54:01.980532349 +0000 UTC m=+1416.158725055" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.998345 4837 scope.go:117] "RemoveContainer" containerID="9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c" Jan 11 17:54:01 crc kubenswrapper[4837]: I0111 17:54:01.998916 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c636cfd6-27dc-4171-942e-8bb2cb7046a1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.011196 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.030266 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.034453 4837 scope.go:117] "RemoveContainer" containerID="39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.038137 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 11 17:54:02 crc kubenswrapper[4837]: E0111 17:54:02.038519 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-api" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.038535 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-api" Jan 11 17:54:02 crc kubenswrapper[4837]: E0111 17:54:02.038547 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-log" Jan 11 17:54:02 
crc kubenswrapper[4837]: I0111 17:54:02.038553 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-log" Jan 11 17:54:02 crc kubenswrapper[4837]: E0111 17:54:02.038579 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293316f1-dfe8-42c8-82d3-365be90cdbd1" containerName="nova-manage" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.038586 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="293316f1-dfe8-42c8-82d3-365be90cdbd1" containerName="nova-manage" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.038765 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="293316f1-dfe8-42c8-82d3-365be90cdbd1" containerName="nova-manage" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.038793 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-log" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.038805 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" containerName="nova-api-api" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.039835 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: E0111 17:54:02.039902 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2\": container with ID starting with 39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2 not found: ID does not exist" containerID="39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.039941 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2"} err="failed to get container status \"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2\": rpc error: code = NotFound desc = could not find container \"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2\": container with ID starting with 39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2 not found: ID does not exist" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.039965 4837 scope.go:117] "RemoveContainer" containerID="9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c" Jan 11 17:54:02 crc kubenswrapper[4837]: E0111 17:54:02.041411 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c\": container with ID starting with 9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c not found: ID does not exist" containerID="9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.041451 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c"} err="failed to 
get container status \"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c\": rpc error: code = NotFound desc = could not find container \"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c\": container with ID starting with 9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c not found: ID does not exist" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.041481 4837 scope.go:117] "RemoveContainer" containerID="39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.050414 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.087515 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2"} err="failed to get container status \"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2\": rpc error: code = NotFound desc = could not find container \"39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2\": container with ID starting with 39e71a19adbcfd783551df4c96c33819de6ed1dc22700fd2bbc0b191bfd573b2 not found: ID does not exist" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.087551 4837 scope.go:117] "RemoveContainer" containerID="9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.087634 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.087666 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.087700 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 
17:54:02.088622 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c"} err="failed to get container status \"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c\": rpc error: code = NotFound desc = could not find container \"9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c\": container with ID starting with 9b5615d0bcebedfe1699bb1f876da5f76de922fea5f541fa04a8d2b17f60b26c not found: ID does not exist" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.101043 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-config-data\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.101085 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-logs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.101131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-public-tls-certs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.101189 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " 
pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.101231 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8llp\" (UniqueName: \"kubernetes.io/projected/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-kube-api-access-s8llp\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.101265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.203359 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-public-tls-certs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.203422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.203472 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8llp\" (UniqueName: \"kubernetes.io/projected/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-kube-api-access-s8llp\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.203518 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.203596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-config-data\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.203623 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-logs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.204106 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-logs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.208321 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-config-data\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.208344 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.208507 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-public-tls-certs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.211168 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.220080 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8llp\" (UniqueName: \"kubernetes.io/projected/c6de8f9a-cf35-49c6-8b0c-d75ac48b3691-kube-api-access-s8llp\") pod \"nova-api-0\" (UID: \"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691\") " pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.373388 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c636cfd6-27dc-4171-942e-8bb2cb7046a1" path="/var/lib/kubelet/pods/c636cfd6-27dc-4171-942e-8bb2cb7046a1/volumes" Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.401468 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 11 17:54:02 crc kubenswrapper[4837]: W0111 17:54:02.886009 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6de8f9a_cf35_49c6_8b0c_d75ac48b3691.slice/crio-dca292be4e0c8d5f0668f6273a9ea50e7734af143f65ade1ea3e3d5b4a3c0046 WatchSource:0}: Error finding container dca292be4e0c8d5f0668f6273a9ea50e7734af143f65ade1ea3e3d5b4a3c0046: Status 404 returned error can't find the container with id dca292be4e0c8d5f0668f6273a9ea50e7734af143f65ade1ea3e3d5b4a3c0046 Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.891707 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 11 17:54:02 crc kubenswrapper[4837]: I0111 17:54:02.982240 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691","Type":"ContainerStarted","Data":"dca292be4e0c8d5f0668f6273a9ea50e7734af143f65ade1ea3e3d5b4a3c0046"} Jan 11 17:54:03 crc kubenswrapper[4837]: I0111 17:54:03.997736 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691","Type":"ContainerStarted","Data":"3cb08f2758b4b625315293339624c42df514e46b7bcc7d02b5536946cdfe0c50"} Jan 11 17:54:03 crc kubenswrapper[4837]: I0111 17:54:03.998037 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6de8f9a-cf35-49c6-8b0c-d75ac48b3691","Type":"ContainerStarted","Data":"81cb69c475dfe4e84ce581382bee6485b5b35268142d150177a28ef4e7b30c0f"} Jan 11 17:54:04 crc kubenswrapper[4837]: I0111 17:54:04.024980 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.024951756 podStartE2EDuration="2.024951756s" podCreationTimestamp="2026-01-11 17:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:54:04.019074818 +0000 UTC m=+1418.197267564" watchObservedRunningTime="2026-01-11 17:54:04.024951756 +0000 UTC m=+1418.203144502" Jan 11 17:54:04 crc kubenswrapper[4837]: I0111 17:54:04.363113 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:38048->10.217.0.195:8775: read: connection reset by peer" Jan 11 17:54:04 crc kubenswrapper[4837]: I0111 17:54:04.363148 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:38050->10.217.0.195:8775: read: connection reset by peer" Jan 11 17:54:05 crc kubenswrapper[4837]: I0111 17:54:05.511932 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:54:05 crc kubenswrapper[4837]: I0111 17:54:05.511979 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:54:06 crc kubenswrapper[4837]: I0111 17:54:06.561118 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t482m" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="registry-server" probeResult="failure" output=< Jan 11 17:54:06 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 17:54:06 crc kubenswrapper[4837]: > Jan 11 17:54:06 crc kubenswrapper[4837]: E0111 17:54:06.780312 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:54:06 crc kubenswrapper[4837]: E0111 17:54:06.785437 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:54:06 crc kubenswrapper[4837]: E0111 17:54:06.789013 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 11 17:54:06 crc kubenswrapper[4837]: E0111 17:54:06.789168 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" containerName="nova-scheduler-scheduler" Jan 11 17:54:06 crc kubenswrapper[4837]: I0111 17:54:06.932598 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:54:06 crc kubenswrapper[4837]: I0111 17:54:06.999009 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-combined-ca-bundle\") pod \"67f85854-48e7-4151-9051-da416208058a\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " Jan 11 17:54:06 crc kubenswrapper[4837]: I0111 17:54:06.999175 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-nova-metadata-tls-certs\") pod \"67f85854-48e7-4151-9051-da416208058a\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " Jan 11 17:54:06 crc kubenswrapper[4837]: I0111 17:54:06.999241 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f85854-48e7-4151-9051-da416208058a-logs\") pod \"67f85854-48e7-4151-9051-da416208058a\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " Jan 11 17:54:06 crc kubenswrapper[4837]: I0111 17:54:06.999291 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7kqf\" (UniqueName: \"kubernetes.io/projected/67f85854-48e7-4151-9051-da416208058a-kube-api-access-b7kqf\") pod \"67f85854-48e7-4151-9051-da416208058a\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " Jan 11 17:54:06 crc kubenswrapper[4837]: I0111 17:54:06.999321 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-config-data\") pod \"67f85854-48e7-4151-9051-da416208058a\" (UID: \"67f85854-48e7-4151-9051-da416208058a\") " Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:06.999885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/67f85854-48e7-4151-9051-da416208058a-logs" (OuterVolumeSpecName: "logs") pod "67f85854-48e7-4151-9051-da416208058a" (UID: "67f85854-48e7-4151-9051-da416208058a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.008970 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f85854-48e7-4151-9051-da416208058a-kube-api-access-b7kqf" (OuterVolumeSpecName: "kube-api-access-b7kqf") pod "67f85854-48e7-4151-9051-da416208058a" (UID: "67f85854-48e7-4151-9051-da416208058a"). InnerVolumeSpecName "kube-api-access-b7kqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.033059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67f85854-48e7-4151-9051-da416208058a" (UID: "67f85854-48e7-4151-9051-da416208058a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.036532 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-config-data" (OuterVolumeSpecName: "config-data") pod "67f85854-48e7-4151-9051-da416208058a" (UID: "67f85854-48e7-4151-9051-da416208058a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.044331 4837 generic.go:334] "Generic (PLEG): container finished" podID="67f85854-48e7-4151-9051-da416208058a" containerID="b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477" exitCode=0 Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.044370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67f85854-48e7-4151-9051-da416208058a","Type":"ContainerDied","Data":"b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477"} Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.044396 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67f85854-48e7-4151-9051-da416208058a","Type":"ContainerDied","Data":"f464159971be53f939f389d4f897e56cf937b01d5d65a5cb7d09b06703116abf"} Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.044412 4837 scope.go:117] "RemoveContainer" containerID="b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.044529 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.064688 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "67f85854-48e7-4151-9051-da416208058a" (UID: "67f85854-48e7-4151-9051-da416208058a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.101191 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67f85854-48e7-4151-9051-da416208058a-logs\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.101223 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7kqf\" (UniqueName: \"kubernetes.io/projected/67f85854-48e7-4151-9051-da416208058a-kube-api-access-b7kqf\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.101236 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.101247 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.101257 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f85854-48e7-4151-9051-da416208058a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.171285 4837 scope.go:117] "RemoveContainer" containerID="bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.213952 4837 scope.go:117] "RemoveContainer" containerID="b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477" Jan 11 17:54:07 crc kubenswrapper[4837]: E0111 17:54:07.214894 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477\": container with ID starting with b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477 not found: ID does not exist" containerID="b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.214926 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477"} err="failed to get container status \"b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477\": rpc error: code = NotFound desc = could not find container \"b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477\": container with ID starting with b9d102816df988088ac57543649f2da552a2588403c9f36fdfb82e34b1efe477 not found: ID does not exist" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.214948 4837 scope.go:117] "RemoveContainer" containerID="bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3" Jan 11 17:54:07 crc kubenswrapper[4837]: E0111 17:54:07.215259 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3\": container with ID starting with bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3 not found: ID does not exist" containerID="bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.215306 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3"} err="failed to get container status \"bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3\": rpc error: code = NotFound desc = could not find container \"bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3\": container with ID 
starting with bba8c1f23997bc8651038ff0037f72ff8ba795753b6b5b63c048b341c0246ca3 not found: ID does not exist" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.405402 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.417200 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.442293 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.459062 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:54:07 crc kubenswrapper[4837]: E0111 17:54:07.459900 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-log" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.459916 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-log" Jan 11 17:54:07 crc kubenswrapper[4837]: E0111 17:54:07.459930 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" containerName="nova-scheduler-scheduler" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.459936 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" containerName="nova-scheduler-scheduler" Jan 11 17:54:07 crc kubenswrapper[4837]: E0111 17:54:07.459976 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-metadata" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.459982 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-metadata" Jan 11 17:54:07 crc 
kubenswrapper[4837]: I0111 17:54:07.462030 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-metadata" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.462061 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f85854-48e7-4151-9051-da416208058a" containerName="nova-metadata-log" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.462075 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" containerName="nova-scheduler-scheduler" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.463129 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.465982 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.469014 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.470177 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.609459 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-config-data\") pod \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.609604 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nrwj\" (UniqueName: \"kubernetes.io/projected/33ef5a38-f83c-40ea-8036-fe830d8a32a3-kube-api-access-5nrwj\") pod \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " Jan 11 
17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.609764 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-combined-ca-bundle\") pod \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\" (UID: \"33ef5a38-f83c-40ea-8036-fe830d8a32a3\") " Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.610276 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1847dbaa-536d-48f0-ac85-de5ad698e483-logs\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.610708 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.610916 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.610995 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgv8q\" (UniqueName: \"kubernetes.io/projected/1847dbaa-536d-48f0-ac85-de5ad698e483-kube-api-access-dgv8q\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.611952 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-config-data\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.612798 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ef5a38-f83c-40ea-8036-fe830d8a32a3-kube-api-access-5nrwj" (OuterVolumeSpecName: "kube-api-access-5nrwj") pod "33ef5a38-f83c-40ea-8036-fe830d8a32a3" (UID: "33ef5a38-f83c-40ea-8036-fe830d8a32a3"). InnerVolumeSpecName "kube-api-access-5nrwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.640953 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33ef5a38-f83c-40ea-8036-fe830d8a32a3" (UID: "33ef5a38-f83c-40ea-8036-fe830d8a32a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.652749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-config-data" (OuterVolumeSpecName: "config-data") pod "33ef5a38-f83c-40ea-8036-fe830d8a32a3" (UID: "33ef5a38-f83c-40ea-8036-fe830d8a32a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.714334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.714490 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgv8q\" (UniqueName: \"kubernetes.io/projected/1847dbaa-536d-48f0-ac85-de5ad698e483-kube-api-access-dgv8q\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.714554 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-config-data\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.714759 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1847dbaa-536d-48f0-ac85-de5ad698e483-logs\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.714872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.715122 4837 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.715155 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ef5a38-f83c-40ea-8036-fe830d8a32a3-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.715182 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nrwj\" (UniqueName: \"kubernetes.io/projected/33ef5a38-f83c-40ea-8036-fe830d8a32a3-kube-api-access-5nrwj\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.715194 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1847dbaa-536d-48f0-ac85-de5ad698e483-logs\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.718638 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.720053 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-config-data\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.721284 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1847dbaa-536d-48f0-ac85-de5ad698e483-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.741912 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgv8q\" (UniqueName: \"kubernetes.io/projected/1847dbaa-536d-48f0-ac85-de5ad698e483-kube-api-access-dgv8q\") pod \"nova-metadata-0\" (UID: \"1847dbaa-536d-48f0-ac85-de5ad698e483\") " pod="openstack/nova-metadata-0" Jan 11 17:54:07 crc kubenswrapper[4837]: I0111 17:54:07.793154 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.053956 4837 generic.go:334] "Generic (PLEG): container finished" podID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" exitCode=0 Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.054008 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33ef5a38-f83c-40ea-8036-fe830d8a32a3","Type":"ContainerDied","Data":"05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0"} Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.054257 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33ef5a38-f83c-40ea-8036-fe830d8a32a3","Type":"ContainerDied","Data":"f5117c2dfd5aed3f94406d50570fe2d997abb59c1e98589d655d4a64b1e3ddad"} Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.054277 4837 scope.go:117] "RemoveContainer" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.054021 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.083715 4837 scope.go:117] "RemoveContainer" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" Jan 11 17:54:08 crc kubenswrapper[4837]: E0111 17:54:08.084391 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0\": container with ID starting with 05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0 not found: ID does not exist" containerID="05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.084453 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0"} err="failed to get container status \"05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0\": rpc error: code = NotFound desc = could not find container \"05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0\": container with ID starting with 05e495b4e01872df82d2f91347f5de6102ead04db40726141e08985501daa0a0 not found: ID does not exist" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.089989 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.101667 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.124467 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.126135 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.129409 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.136427 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:54:08 crc kubenswrapper[4837]: W0111 17:54:08.283922 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1847dbaa_536d_48f0_ac85_de5ad698e483.slice/crio-a18e56dc50a6ba3c2d072232402e8c46d1fb658cc15626302d9acc80a99d03ab WatchSource:0}: Error finding container a18e56dc50a6ba3c2d072232402e8c46d1fb658cc15626302d9acc80a99d03ab: Status 404 returned error can't find the container with id a18e56dc50a6ba3c2d072232402e8c46d1fb658cc15626302d9acc80a99d03ab Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.294520 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.325753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtmck\" (UniqueName: \"kubernetes.io/projected/4180ef05-e41c-4e74-8e23-41fbda984554-kube-api-access-jtmck\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.325918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4180ef05-e41c-4e74-8e23-41fbda984554-config-data\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.325972 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4180ef05-e41c-4e74-8e23-41fbda984554-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.381601 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ef5a38-f83c-40ea-8036-fe830d8a32a3" path="/var/lib/kubelet/pods/33ef5a38-f83c-40ea-8036-fe830d8a32a3/volumes" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.382959 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f85854-48e7-4151-9051-da416208058a" path="/var/lib/kubelet/pods/67f85854-48e7-4151-9051-da416208058a/volumes" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.428369 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtmck\" (UniqueName: \"kubernetes.io/projected/4180ef05-e41c-4e74-8e23-41fbda984554-kube-api-access-jtmck\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.428511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4180ef05-e41c-4e74-8e23-41fbda984554-config-data\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.428535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4180ef05-e41c-4e74-8e23-41fbda984554-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.435144 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4180ef05-e41c-4e74-8e23-41fbda984554-config-data\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.442388 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4180ef05-e41c-4e74-8e23-41fbda984554-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.448223 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtmck\" (UniqueName: \"kubernetes.io/projected/4180ef05-e41c-4e74-8e23-41fbda984554-kube-api-access-jtmck\") pod \"nova-scheduler-0\" (UID: \"4180ef05-e41c-4e74-8e23-41fbda984554\") " pod="openstack/nova-scheduler-0" Jan 11 17:54:08 crc kubenswrapper[4837]: I0111 17:54:08.746003 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 11 17:54:09 crc kubenswrapper[4837]: I0111 17:54:09.065891 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1847dbaa-536d-48f0-ac85-de5ad698e483","Type":"ContainerStarted","Data":"a47b1593f5c642bed50bf3982dcf124a6fc64b522f5905de5c6508004de22f5f"} Jan 11 17:54:09 crc kubenswrapper[4837]: I0111 17:54:09.066348 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1847dbaa-536d-48f0-ac85-de5ad698e483","Type":"ContainerStarted","Data":"ffce5c5c43f916008c47974694022f01ca15715d08c5384d67ba87f06678e20a"} Jan 11 17:54:09 crc kubenswrapper[4837]: I0111 17:54:09.066361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1847dbaa-536d-48f0-ac85-de5ad698e483","Type":"ContainerStarted","Data":"a18e56dc50a6ba3c2d072232402e8c46d1fb658cc15626302d9acc80a99d03ab"} Jan 11 17:54:09 crc kubenswrapper[4837]: I0111 17:54:09.093380 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.093358663 podStartE2EDuration="2.093358663s" podCreationTimestamp="2026-01-11 17:54:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:54:09.083733915 +0000 UTC m=+1423.261926641" watchObservedRunningTime="2026-01-11 17:54:09.093358663 +0000 UTC m=+1423.271551369" Jan 11 17:54:09 crc kubenswrapper[4837]: I0111 17:54:09.177737 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 11 17:54:09 crc kubenswrapper[4837]: W0111 17:54:09.181762 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4180ef05_e41c_4e74_8e23_41fbda984554.slice/crio-e29cc1e7f7c88ac2015e64d9245cfab389818eff8f08246bb94ca37f772d3df1 WatchSource:0}: 
Error finding container e29cc1e7f7c88ac2015e64d9245cfab389818eff8f08246bb94ca37f772d3df1: Status 404 returned error can't find the container with id e29cc1e7f7c88ac2015e64d9245cfab389818eff8f08246bb94ca37f772d3df1 Jan 11 17:54:09 crc kubenswrapper[4837]: I0111 17:54:09.443661 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:54:09 crc kubenswrapper[4837]: I0111 17:54:09.443962 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:54:10 crc kubenswrapper[4837]: I0111 17:54:10.096483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4180ef05-e41c-4e74-8e23-41fbda984554","Type":"ContainerStarted","Data":"ec01cd116bec6689e84ab47b5b9ec5ee8e4e643e24411ea822033ea8f1e453e7"} Jan 11 17:54:10 crc kubenswrapper[4837]: I0111 17:54:10.096881 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4180ef05-e41c-4e74-8e23-41fbda984554","Type":"ContainerStarted","Data":"e29cc1e7f7c88ac2015e64d9245cfab389818eff8f08246bb94ca37f772d3df1"} Jan 11 17:54:10 crc kubenswrapper[4837]: I0111 17:54:10.126666 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.126644308 podStartE2EDuration="2.126644308s" podCreationTimestamp="2026-01-11 17:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 
17:54:10.119536847 +0000 UTC m=+1424.297729563" watchObservedRunningTime="2026-01-11 17:54:10.126644308 +0000 UTC m=+1424.304837024" Jan 11 17:54:12 crc kubenswrapper[4837]: I0111 17:54:12.402329 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 11 17:54:12 crc kubenswrapper[4837]: I0111 17:54:12.402412 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 11 17:54:12 crc kubenswrapper[4837]: I0111 17:54:12.793391 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 11 17:54:12 crc kubenswrapper[4837]: I0111 17:54:12.794156 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 11 17:54:13 crc kubenswrapper[4837]: I0111 17:54:13.418925 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c6de8f9a-cf35-49c6-8b0c-d75ac48b3691" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 11 17:54:13 crc kubenswrapper[4837]: I0111 17:54:13.418892 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c6de8f9a-cf35-49c6-8b0c-d75ac48b3691" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 11 17:54:13 crc kubenswrapper[4837]: I0111 17:54:13.746557 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 11 17:54:15 crc kubenswrapper[4837]: I0111 17:54:15.566352 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:54:15 crc kubenswrapper[4837]: I0111 17:54:15.628497 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:54:15 crc kubenswrapper[4837]: I0111 17:54:15.805199 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t482m"] Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.160948 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t482m" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="registry-server" containerID="cri-o://0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904" gracePeriod=2 Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.639730 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.793759 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.793795 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.819637 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-catalog-content\") pod \"9c525acb-2a0a-4df5-866a-507d5ba03364\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.819720 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/9c525acb-2a0a-4df5-866a-507d5ba03364-kube-api-access-v4srt\") pod \"9c525acb-2a0a-4df5-866a-507d5ba03364\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.819885 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-utilities\") pod \"9c525acb-2a0a-4df5-866a-507d5ba03364\" (UID: \"9c525acb-2a0a-4df5-866a-507d5ba03364\") " Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.820896 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-utilities" (OuterVolumeSpecName: "utilities") pod "9c525acb-2a0a-4df5-866a-507d5ba03364" (UID: "9c525acb-2a0a-4df5-866a-507d5ba03364"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.832136 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c525acb-2a0a-4df5-866a-507d5ba03364-kube-api-access-v4srt" (OuterVolumeSpecName: "kube-api-access-v4srt") pod "9c525acb-2a0a-4df5-866a-507d5ba03364" (UID: "9c525acb-2a0a-4df5-866a-507d5ba03364"). InnerVolumeSpecName "kube-api-access-v4srt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.921863 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.922384 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4srt\" (UniqueName: \"kubernetes.io/projected/9c525acb-2a0a-4df5-866a-507d5ba03364-kube-api-access-v4srt\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:17 crc kubenswrapper[4837]: I0111 17:54:17.936986 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c525acb-2a0a-4df5-866a-507d5ba03364" (UID: "9c525acb-2a0a-4df5-866a-507d5ba03364"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.024017 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c525acb-2a0a-4df5-866a-507d5ba03364-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.170386 4837 generic.go:334] "Generic (PLEG): container finished" podID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerID="0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904" exitCode=0 Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.170416 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t482m" event={"ID":"9c525acb-2a0a-4df5-866a-507d5ba03364","Type":"ContainerDied","Data":"0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904"} Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.171312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t482m" event={"ID":"9c525acb-2a0a-4df5-866a-507d5ba03364","Type":"ContainerDied","Data":"ba499b510764f17e49dea8ad36fd1218fa1738031338f07f882ca12ec309b9ad"} Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.170465 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t482m" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.171398 4837 scope.go:117] "RemoveContainer" containerID="0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.191999 4837 scope.go:117] "RemoveContainer" containerID="2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.208938 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t482m"] Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.230647 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t482m"] Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.232799 4837 scope.go:117] "RemoveContainer" containerID="8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.273181 4837 scope.go:117] "RemoveContainer" containerID="0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904" Jan 11 17:54:18 crc kubenswrapper[4837]: E0111 17:54:18.273662 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904\": container with ID starting with 0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904 not found: ID does not exist" containerID="0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.273722 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904"} err="failed to get container status \"0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904\": rpc error: code = NotFound desc = could not find container 
\"0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904\": container with ID starting with 0f310d999dcf159194a0af6c03ba756a625a583a8b2c0e9a75d25993ce1e7904 not found: ID does not exist" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.273756 4837 scope.go:117] "RemoveContainer" containerID="2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb" Jan 11 17:54:18 crc kubenswrapper[4837]: E0111 17:54:18.274248 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb\": container with ID starting with 2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb not found: ID does not exist" containerID="2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.274295 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb"} err="failed to get container status \"2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb\": rpc error: code = NotFound desc = could not find container \"2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb\": container with ID starting with 2b3ca64ea875b2fa652fc6da261da8ea8f6ac568c89d18f7c8fe8abf6a2087eb not found: ID does not exist" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.274323 4837 scope.go:117] "RemoveContainer" containerID="8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b" Jan 11 17:54:18 crc kubenswrapper[4837]: E0111 17:54:18.276120 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b\": container with ID starting with 8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b not found: ID does not exist" 
containerID="8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.276153 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b"} err="failed to get container status \"8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b\": rpc error: code = NotFound desc = could not find container \"8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b\": container with ID starting with 8a23e9219984e4dc111b09b363842c9957d4564340476ccdb54b9a93846e608b not found: ID does not exist" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.376019 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" path="/var/lib/kubelet/pods/9c525acb-2a0a-4df5-866a-507d5ba03364/volumes" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.746927 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.782491 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.807918 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1847dbaa-536d-48f0-ac85-de5ad698e483" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 11 17:54:18 crc kubenswrapper[4837]: I0111 17:54:18.808047 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1847dbaa-536d-48f0-ac85-de5ad698e483" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Jan 11 17:54:19 crc kubenswrapper[4837]: I0111 17:54:19.233477 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 11 17:54:19 crc kubenswrapper[4837]: I0111 17:54:19.545791 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 11 17:54:22 crc kubenswrapper[4837]: I0111 17:54:22.410763 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 11 17:54:22 crc kubenswrapper[4837]: I0111 17:54:22.411439 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 11 17:54:22 crc kubenswrapper[4837]: I0111 17:54:22.415596 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 11 17:54:22 crc kubenswrapper[4837]: I0111 17:54:22.420438 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 11 17:54:23 crc kubenswrapper[4837]: I0111 17:54:23.235616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 11 17:54:23 crc kubenswrapper[4837]: I0111 17:54:23.242629 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 11 17:54:27 crc kubenswrapper[4837]: I0111 17:54:27.800354 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 11 17:54:27 crc kubenswrapper[4837]: I0111 17:54:27.804048 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 11 17:54:27 crc kubenswrapper[4837]: I0111 17:54:27.809433 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 11 17:54:28 crc kubenswrapper[4837]: I0111 17:54:28.286981 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Jan 11 17:54:36 crc kubenswrapper[4837]: I0111 17:54:36.316308 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:54:37 crc kubenswrapper[4837]: I0111 17:54:37.162385 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:54:39 crc kubenswrapper[4837]: I0111 17:54:39.444309 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:54:39 crc kubenswrapper[4837]: I0111 17:54:39.444779 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:54:39 crc kubenswrapper[4837]: I0111 17:54:39.444821 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 17:54:39 crc kubenswrapper[4837]: I0111 17:54:39.445434 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9cdcfc59acae9ad70b4c250fd104a15cd4c9020810ce5e6aa418583f1bc934e"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 17:54:39 crc kubenswrapper[4837]: I0111 17:54:39.445474 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" 
containerName="machine-config-daemon" containerID="cri-o://e9cdcfc59acae9ad70b4c250fd104a15cd4c9020810ce5e6aa418583f1bc934e" gracePeriod=600 Jan 11 17:54:40 crc kubenswrapper[4837]: I0111 17:54:40.412262 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="e9cdcfc59acae9ad70b4c250fd104a15cd4c9020810ce5e6aa418583f1bc934e" exitCode=0 Jan 11 17:54:40 crc kubenswrapper[4837]: I0111 17:54:40.412376 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"e9cdcfc59acae9ad70b4c250fd104a15cd4c9020810ce5e6aa418583f1bc934e"} Jan 11 17:54:40 crc kubenswrapper[4837]: I0111 17:54:40.412816 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"} Jan 11 17:54:40 crc kubenswrapper[4837]: I0111 17:54:40.412838 4837 scope.go:117] "RemoveContainer" containerID="018458ba654491d89959a574a332d71c982ec5d0e53864759d97887f8cd68688" Jan 11 17:54:40 crc kubenswrapper[4837]: I0111 17:54:40.508488 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerName="rabbitmq" containerID="cri-o://5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879" gracePeriod=604796 Jan 11 17:54:41 crc kubenswrapper[4837]: I0111 17:54:41.319800 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" containerName="rabbitmq" containerID="cri-o://96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164" gracePeriod=604796 Jan 11 17:54:42 crc kubenswrapper[4837]: I0111 
17:54:42.542851 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Jan 11 17:54:42 crc kubenswrapper[4837]: I0111 17:54:42.563956 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.476690 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.486686 4837 generic.go:334] "Generic (PLEG): container finished" podID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerID="5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879" exitCode=0 Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.486731 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.486755 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89ca88bf-2462-4fce-8a85-8dc04655b21c","Type":"ContainerDied","Data":"5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879"} Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.486791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89ca88bf-2462-4fce-8a85-8dc04655b21c","Type":"ContainerDied","Data":"b8506eccbe2ac34ea0c156f9f059de610ca78f68e43af4f6163f0631a5b79a96"} Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.486810 4837 scope.go:117] "RemoveContainer" containerID="5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.627377 4837 scope.go:117] "RemoveContainer" containerID="8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641423 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-erlang-cookie\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641495 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-confd\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89ca88bf-2462-4fce-8a85-8dc04655b21c-pod-info\") pod 
\"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641596 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-config-data\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641626 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-plugins\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641711 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641764 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-plugins-conf\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641815 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-tls\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641865 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz8zc\" 
(UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-kube-api-access-kz8zc\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641955 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-server-conf\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.641996 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89ca88bf-2462-4fce-8a85-8dc04655b21c-erlang-cookie-secret\") pod \"89ca88bf-2462-4fce-8a85-8dc04655b21c\" (UID: \"89ca88bf-2462-4fce-8a85-8dc04655b21c\") " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.643341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.647331 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.648522 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.651576 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.651583 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.653725 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-kube-api-access-kz8zc" (OuterVolumeSpecName: "kube-api-access-kz8zc") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "kube-api-access-kz8zc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.654350 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/89ca88bf-2462-4fce-8a85-8dc04655b21c-pod-info" (OuterVolumeSpecName: "pod-info") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.663785 4837 scope.go:117] "RemoveContainer" containerID="5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879" Jan 11 17:54:47 crc kubenswrapper[4837]: E0111 17:54:47.664708 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879\": container with ID starting with 5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879 not found: ID does not exist" containerID="5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.664829 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879"} err="failed to get container status \"5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879\": rpc error: code = NotFound desc = could not find container \"5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879\": container with ID starting with 5e4f18128979344961b956f7b3cb1cf46c0c5cabba9f3d76bd3725aae7b33879 not found: ID does not exist" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.664908 4837 scope.go:117] "RemoveContainer" containerID="8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.666918 4837 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ca88bf-2462-4fce-8a85-8dc04655b21c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: E0111 17:54:47.668564 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051\": container with ID starting with 8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051 not found: ID does not exist" containerID="8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.668619 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051"} err="failed to get container status \"8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051\": rpc error: code = NotFound desc = could not find container \"8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051\": container with ID starting with 8e0c518885ba9fd613894e81f44f81cf63dac770e6d902da138829b469a23051 not found: ID does not exist" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.707760 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-config-data" (OuterVolumeSpecName: "config-data") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.729193 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-server-conf" (OuterVolumeSpecName: "server-conf") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746058 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746091 4837 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746105 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746120 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz8zc\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-kube-api-access-kz8zc\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746136 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-server-conf\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746149 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/89ca88bf-2462-4fce-8a85-8dc04655b21c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746162 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746175 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89ca88bf-2462-4fce-8a85-8dc04655b21c-pod-info\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746188 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89ca88bf-2462-4fce-8a85-8dc04655b21c-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.746198 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.782273 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.827471 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-2dr5n"] Jan 11 17:54:47 crc kubenswrapper[4837]: E0111 17:54:47.827910 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="extract-content" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.827927 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="extract-content" Jan 11 17:54:47 
crc kubenswrapper[4837]: E0111 17:54:47.827949 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerName="rabbitmq" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.827954 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerName="rabbitmq" Jan 11 17:54:47 crc kubenswrapper[4837]: E0111 17:54:47.827968 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="extract-utilities" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.827975 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="extract-utilities" Jan 11 17:54:47 crc kubenswrapper[4837]: E0111 17:54:47.827987 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerName="setup-container" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.827993 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerName="setup-container" Jan 11 17:54:47 crc kubenswrapper[4837]: E0111 17:54:47.828010 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="registry-server" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.828017 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="registry-server" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.828188 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c525acb-2a0a-4df5-866a-507d5ba03364" containerName="registry-server" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.828198 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" containerName="rabbitmq" Jan 11 17:54:47 crc 
kubenswrapper[4837]: I0111 17:54:47.830934 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.836982 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.838000 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-2dr5n"] Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.847695 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.880694 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "89ca88bf-2462-4fce-8a85-8dc04655b21c" (UID: "89ca88bf-2462-4fce-8a85-8dc04655b21c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.894648 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950233 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-svc\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950436 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950464 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950497 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w696\" (UniqueName: \"kubernetes.io/projected/a3c82eb3-b0a8-4972-bc61-c990541be2eb-kube-api-access-4w696\") pod 
\"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950527 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-config\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950636 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.950722 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89ca88bf-2462-4fce-8a85-8dc04655b21c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:47 crc kubenswrapper[4837]: I0111 17:54:47.984699 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-2dr5n"] Jan 11 17:54:47 crc kubenswrapper[4837]: E0111 17:54:47.985372 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-4w696 openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" podUID="a3c82eb3-b0a8-4972-bc61-c990541be2eb" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.000381 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-ch9dq"] Jan 11 17:54:48 crc kubenswrapper[4837]: E0111 17:54:48.000863 
4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" containerName="setup-container" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.000886 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" containerName="setup-container" Jan 11 17:54:48 crc kubenswrapper[4837]: E0111 17:54:48.000922 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" containerName="rabbitmq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.000931 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" containerName="rabbitmq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.001165 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" containerName="rabbitmq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.003722 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.014850 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-ch9dq"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052031 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-tls\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052156 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-plugins-conf\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052187 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f21b505a-45c3-4f7e-b323-204d384185b9-pod-info\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052225 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f21b505a-45c3-4f7e-b323-204d384185b9-erlang-cookie-secret\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052245 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-plugins\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " 
Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052325 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-confd\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-erlang-cookie\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052407 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-config-data\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052450 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052487 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvv9\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-kube-api-access-5lvv9\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052526 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-server-conf\") pod \"f21b505a-45c3-4f7e-b323-204d384185b9\" (UID: \"f21b505a-45c3-4f7e-b323-204d384185b9\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052840 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-svc\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052901 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.052985 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.053018 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-sb\") pod 
\"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.053046 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w696\" (UniqueName: \"kubernetes.io/projected/a3c82eb3-b0a8-4972-bc61-c990541be2eb-kube-api-access-4w696\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.053075 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-config\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.054142 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-svc\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.054162 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-config\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.054263 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " 
pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.054846 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.055109 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.055542 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.055768 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.056347 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.057342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.063892 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f21b505a-45c3-4f7e-b323-204d384185b9-pod-info" (OuterVolumeSpecName: "pod-info") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.063904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b505a-45c3-4f7e-b323-204d384185b9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.063961 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-kube-api-access-5lvv9" (OuterVolumeSpecName: "kube-api-access-5lvv9") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "kube-api-access-5lvv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.063974 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.063992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.069606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w696\" (UniqueName: \"kubernetes.io/projected/a3c82eb3-b0a8-4972-bc61-c990541be2eb-kube-api-access-4w696\") pod \"dnsmasq-dns-67b789f86c-2dr5n\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.078276 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-config-data" (OuterVolumeSpecName: "config-data") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.133396 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-server-conf" (OuterVolumeSpecName: "server-conf") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155448 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155531 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckl49\" (UniqueName: \"kubernetes.io/projected/25f14f43-12d0-4c7d-b823-4c6d3eecd355-kube-api-access-ckl49\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155554 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-config\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155628 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155695 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: 
\"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155826 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155836 4837 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155845 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f21b505a-45c3-4f7e-b323-204d384185b9-pod-info\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155855 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f21b505a-45c3-4f7e-b323-204d384185b9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc 
kubenswrapper[4837]: I0111 17:54:48.155864 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155872 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155883 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155904 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155913 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvv9\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-kube-api-access-5lvv9\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.155922 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f21b505a-45c3-4f7e-b323-204d384185b9-server-conf\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.163958 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.167524 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-confd" (OuterVolumeSpecName: 
"rabbitmq-confd") pod "f21b505a-45c3-4f7e-b323-204d384185b9" (UID: "f21b505a-45c3-4f7e-b323-204d384185b9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.193106 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.195067 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.211745 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.213376 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.217628 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.217820 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.219104 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.219378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.219429 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.219388 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.219608 4837 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8xf8t" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.239443 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262028 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckl49\" (UniqueName: \"kubernetes.io/projected/25f14f43-12d0-4c7d-b823-4c6d3eecd355-kube-api-access-ckl49\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262072 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-config\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262173 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262221 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262284 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262335 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f21b505a-45c3-4f7e-b323-204d384185b9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.262346 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.263211 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.263307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.263839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.263939 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.264236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.264549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f14f43-12d0-4c7d-b823-4c6d3eecd355-config\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.301187 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckl49\" (UniqueName: 
\"kubernetes.io/projected/25f14f43-12d0-4c7d-b823-4c6d3eecd355-kube-api-access-ckl49\") pod \"dnsmasq-dns-cb6ffcf87-ch9dq\" (UID: \"25f14f43-12d0-4c7d-b823-4c6d3eecd355\") " pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.320347 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364085 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8ee05fd-6122-4935-a3c0-4b9f71175434-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364146 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8ee05fd-6122-4935-a3c0-4b9f71175434-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364198 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364390 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzmh7\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-kube-api-access-wzmh7\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364485 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364557 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364658 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-config-data\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364887 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.364968 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.376854 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ca88bf-2462-4fce-8a85-8dc04655b21c" path="/var/lib/kubelet/pods/89ca88bf-2462-4fce-8a85-8dc04655b21c/volumes" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.466077 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-config-data\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.466131 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.466184 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.466624 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8ee05fd-6122-4935-a3c0-4b9f71175434-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.466787 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.467116 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-config-data\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.467601 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.467709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8ee05fd-6122-4935-a3c0-4b9f71175434-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.467732 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.468130 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.468177 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzmh7\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-kube-api-access-wzmh7\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.468335 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.468400 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.468451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.468516 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.469093 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8ee05fd-6122-4935-a3c0-4b9f71175434-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.469405 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.472571 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8ee05fd-6122-4935-a3c0-4b9f71175434-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.475442 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.483856 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.487901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzmh7\" (UniqueName: \"kubernetes.io/projected/c8ee05fd-6122-4935-a3c0-4b9f71175434-kube-api-access-wzmh7\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.492303 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8ee05fd-6122-4935-a3c0-4b9f71175434-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.514042 4837 generic.go:334] "Generic (PLEG): container finished" podID="f21b505a-45c3-4f7e-b323-204d384185b9" containerID="96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164" exitCode=0 Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.514118 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f21b505a-45c3-4f7e-b323-204d384185b9","Type":"ContainerDied","Data":"96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164"} Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.514145 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f21b505a-45c3-4f7e-b323-204d384185b9","Type":"ContainerDied","Data":"593ebf5705e544460c5f953044732d953c8ad2664adbfad585706c2209e16132"} Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.514161 4837 scope.go:117] "RemoveContainer" 
containerID="96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.514260 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.518700 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.540330 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.546986 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.552946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"c8ee05fd-6122-4935-a3c0-4b9f71175434\") " pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.554633 4837 scope.go:117] "RemoveContainer" containerID="c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.570642 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.607744 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.609747 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.614413 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.614445 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.614550 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xhdt5" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.614628 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.618770 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.619147 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.619863 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.628927 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.634391 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.637384 4837 scope.go:117] "RemoveContainer" containerID="96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164" Jan 11 17:54:48 crc kubenswrapper[4837]: E0111 17:54:48.643991 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164\": container with ID starting with 96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164 not found: ID does not exist" containerID="96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.644037 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164"} err="failed to get container status \"96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164\": rpc error: code = NotFound desc = could not find container \"96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164\": container with ID starting with 96fd6d715572e68e28afddc6708154913dd9d5e9e5f2f26efd5a37f3d50ac164 not found: ID does not exist" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.644066 4837 scope.go:117] "RemoveContainer" containerID="c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc" Jan 11 17:54:48 crc kubenswrapper[4837]: E0111 17:54:48.654979 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc\": container with ID starting with c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc not found: ID does not exist" containerID="c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 
17:54:48.655036 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc"} err="failed to get container status \"c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc\": rpc error: code = NotFound desc = could not find container \"c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc\": container with ID starting with c8a7e39f9bce03573496d4a5fbc8f9cbf25f994a2506c623c9b99578288b1ccc not found: ID does not exist" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.672986 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3c82eb3-b0a8-4972-bc61-c990541be2eb" (UID: "a3c82eb3-b0a8-4972-bc61-c990541be2eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.673039 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-swift-storage-0\") pod \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.673073 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-sb\") pod \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.673141 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-nb\") pod 
\"a3c82eb3-b0a8-4972-bc61-c990541be2eb\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.673213 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-openstack-edpm-ipam\") pod \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.673317 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w696\" (UniqueName: \"kubernetes.io/projected/a3c82eb3-b0a8-4972-bc61-c990541be2eb-kube-api-access-4w696\") pod \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.673360 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-config\") pod \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.673395 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-svc\") pod \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\" (UID: \"a3c82eb3-b0a8-4972-bc61-c990541be2eb\") " Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.674157 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a3c82eb3-b0a8-4972-bc61-c990541be2eb" (UID: "a3c82eb3-b0a8-4972-bc61-c990541be2eb"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.674408 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3c82eb3-b0a8-4972-bc61-c990541be2eb" (UID: "a3c82eb3-b0a8-4972-bc61-c990541be2eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.674450 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3c82eb3-b0a8-4972-bc61-c990541be2eb" (UID: "a3c82eb3-b0a8-4972-bc61-c990541be2eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.674598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-config" (OuterVolumeSpecName: "config") pod "a3c82eb3-b0a8-4972-bc61-c990541be2eb" (UID: "a3c82eb3-b0a8-4972-bc61-c990541be2eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.674819 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3c82eb3-b0a8-4972-bc61-c990541be2eb" (UID: "a3c82eb3-b0a8-4972-bc61-c990541be2eb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.675524 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.675548 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.675560 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.675570 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.675578 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.675586 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c82eb3-b0a8-4972-bc61-c990541be2eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.677663 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c82eb3-b0a8-4972-bc61-c990541be2eb-kube-api-access-4w696" (OuterVolumeSpecName: "kube-api-access-4w696") pod "a3c82eb3-b0a8-4972-bc61-c990541be2eb" (UID: 
"a3c82eb3-b0a8-4972-bc61-c990541be2eb"). InnerVolumeSpecName "kube-api-access-4w696". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.768716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-ch9dq"] Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776632 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776698 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776795 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776841 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776857 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/33de23cd-829c-449c-a816-d8a54f8ea68f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/33de23cd-829c-449c-a816-d8a54f8ea68f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.776986 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.777039 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.777066 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhqr2\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-kube-api-access-vhqr2\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.777119 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.777278 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w696\" (UniqueName: \"kubernetes.io/projected/a3c82eb3-b0a8-4972-bc61-c990541be2eb-kube-api-access-4w696\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:48 crc kubenswrapper[4837]: W0111 17:54:48.781777 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25f14f43_12d0_4c7d_b823_4c6d3eecd355.slice/crio-e65cdf1e3f0279fe374f5bec3200a83055f6993f9bdb627b2cf1b4a7b2f7dcbc WatchSource:0}: Error finding container e65cdf1e3f0279fe374f5bec3200a83055f6993f9bdb627b2cf1b4a7b2f7dcbc: Status 404 returned error can't find the container with id e65cdf1e3f0279fe374f5bec3200a83055f6993f9bdb627b2cf1b4a7b2f7dcbc Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.879352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.880123 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.879977 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.880211 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/33de23cd-829c-449c-a816-d8a54f8ea68f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883661 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883743 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/33de23cd-829c-449c-a816-d8a54f8ea68f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883780 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883847 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhqr2\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-kube-api-access-vhqr2\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.883954 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.884025 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.884262 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/33de23cd-829c-449c-a816-d8a54f8ea68f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.884541 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.885453 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc 
kubenswrapper[4837]: I0111 17:54:48.885883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33de23cd-829c-449c-a816-d8a54f8ea68f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.893190 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.900251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/33de23cd-829c-449c-a816-d8a54f8ea68f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.900562 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.903290 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhqr2\" (UniqueName: \"kubernetes.io/projected/33de23cd-829c-449c-a816-d8a54f8ea68f-kube-api-access-vhqr2\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.920614 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"33de23cd-829c-449c-a816-d8a54f8ea68f\") " pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:48 crc kubenswrapper[4837]: I0111 17:54:48.942958 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.139824 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 11 17:54:49 crc kubenswrapper[4837]: W0111 17:54:49.185010 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ee05fd_6122_4935_a3c0_4b9f71175434.slice/crio-0b19802f098d926a722b27ebe463cc370536109f1ea997f20456ec07f9d97104 WatchSource:0}: Error finding container 0b19802f098d926a722b27ebe463cc370536109f1ea997f20456ec07f9d97104: Status 404 returned error can't find the container with id 0b19802f098d926a722b27ebe463cc370536109f1ea997f20456ec07f9d97104 Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.251128 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 11 17:54:49 crc kubenswrapper[4837]: W0111 17:54:49.301085 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33de23cd_829c_449c_a816_d8a54f8ea68f.slice/crio-149170e241504f19eb6d42fc57c88cdb2003b52f585e8f38e9f64ec300781979 WatchSource:0}: Error finding container 149170e241504f19eb6d42fc57c88cdb2003b52f585e8f38e9f64ec300781979: Status 404 returned error can't find the container with id 149170e241504f19eb6d42fc57c88cdb2003b52f585e8f38e9f64ec300781979 Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.536134 4837 generic.go:334] "Generic (PLEG): container finished" podID="25f14f43-12d0-4c7d-b823-4c6d3eecd355" 
containerID="34f8025ee8d182ab860e1572bc05c6c1180bbcc572102439e292cd2266d0c64e" exitCode=0 Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.536232 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" event={"ID":"25f14f43-12d0-4c7d-b823-4c6d3eecd355","Type":"ContainerDied","Data":"34f8025ee8d182ab860e1572bc05c6c1180bbcc572102439e292cd2266d0c64e"} Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.536655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" event={"ID":"25f14f43-12d0-4c7d-b823-4c6d3eecd355","Type":"ContainerStarted","Data":"e65cdf1e3f0279fe374f5bec3200a83055f6993f9bdb627b2cf1b4a7b2f7dcbc"} Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.548129 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"33de23cd-829c-449c-a816-d8a54f8ea68f","Type":"ContainerStarted","Data":"149170e241504f19eb6d42fc57c88cdb2003b52f585e8f38e9f64ec300781979"} Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.552137 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-2dr5n" Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.552375 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c8ee05fd-6122-4935-a3c0-4b9f71175434","Type":"ContainerStarted","Data":"0b19802f098d926a722b27ebe463cc370536109f1ea997f20456ec07f9d97104"} Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.619614 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-2dr5n"] Jan 11 17:54:49 crc kubenswrapper[4837]: I0111 17:54:49.629813 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-2dr5n"] Jan 11 17:54:50 crc kubenswrapper[4837]: I0111 17:54:50.375150 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c82eb3-b0a8-4972-bc61-c990541be2eb" path="/var/lib/kubelet/pods/a3c82eb3-b0a8-4972-bc61-c990541be2eb/volumes" Jan 11 17:54:50 crc kubenswrapper[4837]: I0111 17:54:50.376012 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21b505a-45c3-4f7e-b323-204d384185b9" path="/var/lib/kubelet/pods/f21b505a-45c3-4f7e-b323-204d384185b9/volumes" Jan 11 17:54:50 crc kubenswrapper[4837]: I0111 17:54:50.561648 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" event={"ID":"25f14f43-12d0-4c7d-b823-4c6d3eecd355","Type":"ContainerStarted","Data":"c2c329c543f03c39e15f7b23f3acb01a91b4fbca0b38f5c8e117c9b18af9e748"} Jan 11 17:54:50 crc kubenswrapper[4837]: I0111 17:54:50.562769 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:50 crc kubenswrapper[4837]: I0111 17:54:50.591172 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" podStartSLOduration=3.591154483 podStartE2EDuration="3.591154483s" podCreationTimestamp="2026-01-11 17:54:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:54:50.579540771 +0000 UTC m=+1464.757733517" watchObservedRunningTime="2026-01-11 17:54:50.591154483 +0000 UTC m=+1464.769347199" Jan 11 17:54:51 crc kubenswrapper[4837]: I0111 17:54:51.571471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"33de23cd-829c-449c-a816-d8a54f8ea68f","Type":"ContainerStarted","Data":"9c7a0826f58e007399a83e8d9ebce96df0afd37e2bad0da06340437ce859e783"} Jan 11 17:54:51 crc kubenswrapper[4837]: I0111 17:54:51.573519 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c8ee05fd-6122-4935-a3c0-4b9f71175434","Type":"ContainerStarted","Data":"5137100e78bfbcc662cf3832cc31bb97e62da101be0681ea9e15af0c53d92502"} Jan 11 17:54:58 crc kubenswrapper[4837]: I0111 17:54:58.323150 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-ch9dq" Jan 11 17:54:58 crc kubenswrapper[4837]: I0111 17:54:58.407954 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-txfdc"] Jan 11 17:54:58 crc kubenswrapper[4837]: I0111 17:54:58.408303 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" podUID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerName="dnsmasq-dns" containerID="cri-o://329a0e61e9944552996287a96c2c6a490d980fe654ce2e1cb9e5ead38b3a34b8" gracePeriod=10 Jan 11 17:54:58 crc kubenswrapper[4837]: I0111 17:54:58.646059 4837 generic.go:334] "Generic (PLEG): container finished" podID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerID="329a0e61e9944552996287a96c2c6a490d980fe654ce2e1cb9e5ead38b3a34b8" exitCode=0 Jan 11 17:54:58 crc kubenswrapper[4837]: I0111 17:54:58.646245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" event={"ID":"d146eef1-2f66-448c-a614-f5832ddbaaa6","Type":"ContainerDied","Data":"329a0e61e9944552996287a96c2c6a490d980fe654ce2e1cb9e5ead38b3a34b8"} Jan 11 17:54:58 crc kubenswrapper[4837]: I0111 17:54:58.908628 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.006258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-swift-storage-0\") pod \"d146eef1-2f66-448c-a614-f5832ddbaaa6\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.006299 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9jgw\" (UniqueName: \"kubernetes.io/projected/d146eef1-2f66-448c-a614-f5832ddbaaa6-kube-api-access-c9jgw\") pod \"d146eef1-2f66-448c-a614-f5832ddbaaa6\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.006439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-nb\") pod \"d146eef1-2f66-448c-a614-f5832ddbaaa6\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.006510 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-sb\") pod \"d146eef1-2f66-448c-a614-f5832ddbaaa6\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.006552 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config\") pod \"d146eef1-2f66-448c-a614-f5832ddbaaa6\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.006599 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-svc\") pod \"d146eef1-2f66-448c-a614-f5832ddbaaa6\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.013874 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d146eef1-2f66-448c-a614-f5832ddbaaa6-kube-api-access-c9jgw" (OuterVolumeSpecName: "kube-api-access-c9jgw") pod "d146eef1-2f66-448c-a614-f5832ddbaaa6" (UID: "d146eef1-2f66-448c-a614-f5832ddbaaa6"). InnerVolumeSpecName "kube-api-access-c9jgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.058406 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d146eef1-2f66-448c-a614-f5832ddbaaa6" (UID: "d146eef1-2f66-448c-a614-f5832ddbaaa6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.062956 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d146eef1-2f66-448c-a614-f5832ddbaaa6" (UID: "d146eef1-2f66-448c-a614-f5832ddbaaa6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.081628 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d146eef1-2f66-448c-a614-f5832ddbaaa6" (UID: "d146eef1-2f66-448c-a614-f5832ddbaaa6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.088917 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d146eef1-2f66-448c-a614-f5832ddbaaa6" (UID: "d146eef1-2f66-448c-a614-f5832ddbaaa6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.112237 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config" (OuterVolumeSpecName: "config") pod "d146eef1-2f66-448c-a614-f5832ddbaaa6" (UID: "d146eef1-2f66-448c-a614-f5832ddbaaa6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.118437 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config\") pod \"d146eef1-2f66-448c-a614-f5832ddbaaa6\" (UID: \"d146eef1-2f66-448c-a614-f5832ddbaaa6\") " Jan 11 17:54:59 crc kubenswrapper[4837]: W0111 17:54:59.118960 4837 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d146eef1-2f66-448c-a614-f5832ddbaaa6/volumes/kubernetes.io~configmap/config Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.119000 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config" (OuterVolumeSpecName: "config") pod "d146eef1-2f66-448c-a614-f5832ddbaaa6" (UID: "d146eef1-2f66-448c-a614-f5832ddbaaa6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.119338 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.119362 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.119373 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-config\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.119383 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.119393 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d146eef1-2f66-448c-a614-f5832ddbaaa6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.119402 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9jgw\" (UniqueName: \"kubernetes.io/projected/d146eef1-2f66-448c-a614-f5832ddbaaa6-kube-api-access-c9jgw\") on node \"crc\" DevicePath \"\"" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.660014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" event={"ID":"d146eef1-2f66-448c-a614-f5832ddbaaa6","Type":"ContainerDied","Data":"d71e930dfad19d22283ebcbe145ebb94f2aec3e69fd6ff607d45e5ea71aaee3c"} Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.660091 4837 scope.go:117] "RemoveContainer" containerID="329a0e61e9944552996287a96c2c6a490d980fe654ce2e1cb9e5ead38b3a34b8" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.660118 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-txfdc" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.699234 4837 scope.go:117] "RemoveContainer" containerID="67b619fb41b4abee9735c879b3c66957d0f9204509bddc0c9821b76af68c53a5" Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.721908 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-txfdc"] Jan 11 17:54:59 crc kubenswrapper[4837]: I0111 17:54:59.736147 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-txfdc"] Jan 11 17:55:00 crc kubenswrapper[4837]: I0111 17:55:00.376620 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d146eef1-2f66-448c-a614-f5832ddbaaa6" path="/var/lib/kubelet/pods/d146eef1-2f66-448c-a614-f5832ddbaaa6/volumes" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.719635 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5"] Jan 11 17:55:06 crc kubenswrapper[4837]: E0111 17:55:06.720965 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerName="init" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.720988 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerName="init" Jan 11 17:55:06 crc kubenswrapper[4837]: E0111 17:55:06.721005 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerName="dnsmasq-dns" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.721017 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerName="dnsmasq-dns" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.721272 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d146eef1-2f66-448c-a614-f5832ddbaaa6" containerName="dnsmasq-dns" Jan 11 17:55:06 crc 
kubenswrapper[4837]: I0111 17:55:06.722228 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.728126 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.728581 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.729258 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.729355 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.753895 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5"] Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.793917 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.793991 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.794180 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmbzc\" (UniqueName: \"kubernetes.io/projected/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-kube-api-access-zmbzc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.794427 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.896937 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.896992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.897050 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zmbzc\" (UniqueName: \"kubernetes.io/projected/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-kube-api-access-zmbzc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.897126 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.903339 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.904203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.906439 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:06 crc kubenswrapper[4837]: I0111 17:55:06.927150 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmbzc\" (UniqueName: \"kubernetes.io/projected/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-kube-api-access-zmbzc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:07 crc kubenswrapper[4837]: I0111 17:55:07.044074 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:07 crc kubenswrapper[4837]: I0111 17:55:07.622970 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5"] Jan 11 17:55:07 crc kubenswrapper[4837]: I0111 17:55:07.774791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" event={"ID":"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6","Type":"ContainerStarted","Data":"87ccf3a520bb90a18f15b60055a4b7d5df391c41a1b2175de4486a3d9d2224f1"} Jan 11 17:55:16 crc kubenswrapper[4837]: I0111 17:55:16.864506 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" event={"ID":"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6","Type":"ContainerStarted","Data":"f2a72c1842510c052690006d13ca43d2e0cee53666f4eeb52bbd2bd9f54b55b1"} Jan 11 17:55:16 crc kubenswrapper[4837]: I0111 17:55:16.900902 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" podStartSLOduration=2.531827903 podStartE2EDuration="10.900878936s" podCreationTimestamp="2026-01-11 
17:55:06 +0000 UTC" firstStartedPulling="2026-01-11 17:55:07.628133225 +0000 UTC m=+1481.806325931" lastFinishedPulling="2026-01-11 17:55:15.997184218 +0000 UTC m=+1490.175376964" observedRunningTime="2026-01-11 17:55:16.889133981 +0000 UTC m=+1491.067326717" watchObservedRunningTime="2026-01-11 17:55:16.900878936 +0000 UTC m=+1491.079071662" Jan 11 17:55:23 crc kubenswrapper[4837]: I0111 17:55:23.945303 4837 generic.go:334] "Generic (PLEG): container finished" podID="33de23cd-829c-449c-a816-d8a54f8ea68f" containerID="9c7a0826f58e007399a83e8d9ebce96df0afd37e2bad0da06340437ce859e783" exitCode=0 Jan 11 17:55:23 crc kubenswrapper[4837]: I0111 17:55:23.945383 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"33de23cd-829c-449c-a816-d8a54f8ea68f","Type":"ContainerDied","Data":"9c7a0826f58e007399a83e8d9ebce96df0afd37e2bad0da06340437ce859e783"} Jan 11 17:55:23 crc kubenswrapper[4837]: I0111 17:55:23.948122 4837 generic.go:334] "Generic (PLEG): container finished" podID="c8ee05fd-6122-4935-a3c0-4b9f71175434" containerID="5137100e78bfbcc662cf3832cc31bb97e62da101be0681ea9e15af0c53d92502" exitCode=0 Jan 11 17:55:23 crc kubenswrapper[4837]: I0111 17:55:23.948195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c8ee05fd-6122-4935-a3c0-4b9f71175434","Type":"ContainerDied","Data":"5137100e78bfbcc662cf3832cc31bb97e62da101be0681ea9e15af0c53d92502"} Jan 11 17:55:24 crc kubenswrapper[4837]: I0111 17:55:24.956408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"33de23cd-829c-449c-a816-d8a54f8ea68f","Type":"ContainerStarted","Data":"a69236b69d4dd0cc327d69fcf3510a339980ba27a2aa095982aff595fbaa8ead"} Jan 11 17:55:24 crc kubenswrapper[4837]: I0111 17:55:24.958022 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:55:24 crc kubenswrapper[4837]: I0111 
17:55:24.960264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c8ee05fd-6122-4935-a3c0-4b9f71175434","Type":"ContainerStarted","Data":"0d70ae551e5bf2bff5b37e6432e25dbaa97d62aea69c44036a855c187d6ad817"} Jan 11 17:55:24 crc kubenswrapper[4837]: I0111 17:55:24.960591 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 11 17:55:24 crc kubenswrapper[4837]: I0111 17:55:24.984922 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.984902138 podStartE2EDuration="36.984902138s" podCreationTimestamp="2026-01-11 17:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:55:24.982067392 +0000 UTC m=+1499.160260108" watchObservedRunningTime="2026-01-11 17:55:24.984902138 +0000 UTC m=+1499.163094844" Jan 11 17:55:25 crc kubenswrapper[4837]: I0111 17:55:25.005177 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.005156072 podStartE2EDuration="37.005156072s" podCreationTimestamp="2026-01-11 17:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 17:55:25.001912535 +0000 UTC m=+1499.180105231" watchObservedRunningTime="2026-01-11 17:55:25.005156072 +0000 UTC m=+1499.183348778" Jan 11 17:55:29 crc kubenswrapper[4837]: I0111 17:55:29.001922 4837 generic.go:334] "Generic (PLEG): container finished" podID="4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" containerID="f2a72c1842510c052690006d13ca43d2e0cee53666f4eeb52bbd2bd9f54b55b1" exitCode=0 Jan 11 17:55:29 crc kubenswrapper[4837]: I0111 17:55:29.002052 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" 
event={"ID":"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6","Type":"ContainerDied","Data":"f2a72c1842510c052690006d13ca43d2e0cee53666f4eeb52bbd2bd9f54b55b1"} Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.458053 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.602712 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-repo-setup-combined-ca-bundle\") pod \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.602876 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-ssh-key-openstack-edpm-ipam\") pod \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.602997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-inventory\") pod \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.603035 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmbzc\" (UniqueName: \"kubernetes.io/projected/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-kube-api-access-zmbzc\") pod \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\" (UID: \"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6\") " Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.611971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" (UID: "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.612069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-kube-api-access-zmbzc" (OuterVolumeSpecName: "kube-api-access-zmbzc") pod "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" (UID: "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6"). InnerVolumeSpecName "kube-api-access-zmbzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.634274 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-inventory" (OuterVolumeSpecName: "inventory") pod "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" (UID: "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.645226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" (UID: "4c81bfed-8b17-4ea6-90f8-794ea9dec0f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.705249 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.705290 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.705306 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmbzc\" (UniqueName: \"kubernetes.io/projected/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-kube-api-access-zmbzc\") on node \"crc\" DevicePath \"\"" Jan 11 17:55:30 crc kubenswrapper[4837]: I0111 17:55:30.705319 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c81bfed-8b17-4ea6-90f8-794ea9dec0f6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.021076 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" event={"ID":"4c81bfed-8b17-4ea6-90f8-794ea9dec0f6","Type":"ContainerDied","Data":"87ccf3a520bb90a18f15b60055a4b7d5df391c41a1b2175de4486a3d9d2224f1"} Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.021117 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87ccf3a520bb90a18f15b60055a4b7d5df391c41a1b2175de4486a3d9d2224f1" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.021164 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.129961 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr"] Jan 11 17:55:31 crc kubenswrapper[4837]: E0111 17:55:31.130347 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.130365 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.130543 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c81bfed-8b17-4ea6-90f8-794ea9dec0f6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.131210 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.133276 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.133606 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.133912 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.134369 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.140925 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr"] Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.314349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw95z\" (UniqueName: \"kubernetes.io/projected/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-kube-api-access-vw95z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.314519 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.314569 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.415915 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.415977 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.416015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw95z\" (UniqueName: \"kubernetes.io/projected/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-kube-api-access-vw95z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.419922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: 
\"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.419955 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.436605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw95z\" (UniqueName: \"kubernetes.io/projected/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-kube-api-access-vw95z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jwxdr\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.463912 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:31 crc kubenswrapper[4837]: I0111 17:55:31.976567 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr"] Jan 11 17:55:32 crc kubenswrapper[4837]: I0111 17:55:32.044156 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" event={"ID":"8c4987b1-e485-4665-9094-ed0f9cd0ed7d","Type":"ContainerStarted","Data":"e01976fc08939a9042df95b2adb118ae5e89e10a8ee892649dc86ccc0c3ad23f"} Jan 11 17:55:33 crc kubenswrapper[4837]: I0111 17:55:33.054410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" event={"ID":"8c4987b1-e485-4665-9094-ed0f9cd0ed7d","Type":"ContainerStarted","Data":"f7a47d67ea8ae82c8e6487c950945675426204ba60c856a2f407a70c0effd9bf"} Jan 11 17:55:33 crc kubenswrapper[4837]: I0111 17:55:33.076878 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" podStartSLOduration=1.237095212 podStartE2EDuration="2.076854413s" podCreationTimestamp="2026-01-11 17:55:31 +0000 UTC" firstStartedPulling="2026-01-11 17:55:31.989996359 +0000 UTC m=+1506.168189065" lastFinishedPulling="2026-01-11 17:55:32.82975556 +0000 UTC m=+1507.007948266" observedRunningTime="2026-01-11 17:55:33.067898463 +0000 UTC m=+1507.246091179" watchObservedRunningTime="2026-01-11 17:55:33.076854413 +0000 UTC m=+1507.255047119" Jan 11 17:55:34 crc kubenswrapper[4837]: I0111 17:55:34.462704 4837 scope.go:117] "RemoveContainer" containerID="892c1071f039c7f21325f41a2bcca20b48d3925ea457a5d2c914d76d7af02ee7" Jan 11 17:55:34 crc kubenswrapper[4837]: I0111 17:55:34.492365 4837 scope.go:117] "RemoveContainer" containerID="1ea4cac02a3e38e4405e89d50c0227c8e1c5d36ab412a14c0dbcb3a3c5750e3f" Jan 11 17:55:34 crc kubenswrapper[4837]: I0111 
17:55:34.550699 4837 scope.go:117] "RemoveContainer" containerID="090c16b6579339fa8a190dd68dae626cf24eced302842fd629cf501463501539" Jan 11 17:55:34 crc kubenswrapper[4837]: I0111 17:55:34.570794 4837 scope.go:117] "RemoveContainer" containerID="0ffdf69a1ed83907515afc15b5586ce4041929133d7be7461fb025aa80cdeb93" Jan 11 17:55:36 crc kubenswrapper[4837]: I0111 17:55:36.092372 4837 generic.go:334] "Generic (PLEG): container finished" podID="8c4987b1-e485-4665-9094-ed0f9cd0ed7d" containerID="f7a47d67ea8ae82c8e6487c950945675426204ba60c856a2f407a70c0effd9bf" exitCode=0 Jan 11 17:55:36 crc kubenswrapper[4837]: I0111 17:55:36.092462 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" event={"ID":"8c4987b1-e485-4665-9094-ed0f9cd0ed7d","Type":"ContainerDied","Data":"f7a47d67ea8ae82c8e6487c950945675426204ba60c856a2f407a70c0effd9bf"} Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.559200 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.699190 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-inventory\") pod \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.699357 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw95z\" (UniqueName: \"kubernetes.io/projected/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-kube-api-access-vw95z\") pod \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.699479 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-ssh-key-openstack-edpm-ipam\") pod \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\" (UID: \"8c4987b1-e485-4665-9094-ed0f9cd0ed7d\") " Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.709013 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-kube-api-access-vw95z" (OuterVolumeSpecName: "kube-api-access-vw95z") pod "8c4987b1-e485-4665-9094-ed0f9cd0ed7d" (UID: "8c4987b1-e485-4665-9094-ed0f9cd0ed7d"). InnerVolumeSpecName "kube-api-access-vw95z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.727059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-inventory" (OuterVolumeSpecName: "inventory") pod "8c4987b1-e485-4665-9094-ed0f9cd0ed7d" (UID: "8c4987b1-e485-4665-9094-ed0f9cd0ed7d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.743478 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c4987b1-e485-4665-9094-ed0f9cd0ed7d" (UID: "8c4987b1-e485-4665-9094-ed0f9cd0ed7d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.802959 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.803009 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw95z\" (UniqueName: \"kubernetes.io/projected/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-kube-api-access-vw95z\") on node \"crc\" DevicePath \"\"" Jan 11 17:55:37 crc kubenswrapper[4837]: I0111 17:55:37.803032 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c4987b1-e485-4665-9094-ed0f9cd0ed7d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.124235 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" event={"ID":"8c4987b1-e485-4665-9094-ed0f9cd0ed7d","Type":"ContainerDied","Data":"e01976fc08939a9042df95b2adb118ae5e89e10a8ee892649dc86ccc0c3ad23f"} Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.124288 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e01976fc08939a9042df95b2adb118ae5e89e10a8ee892649dc86ccc0c3ad23f" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 
17:55:38.124350 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jwxdr" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.193935 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw"] Jan 11 17:55:38 crc kubenswrapper[4837]: E0111 17:55:38.194526 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4987b1-e485-4665-9094-ed0f9cd0ed7d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.194561 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4987b1-e485-4665-9094-ed0f9cd0ed7d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.194951 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c4987b1-e485-4665-9094-ed0f9cd0ed7d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.196040 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.199230 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.200098 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.200331 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.200852 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.209947 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw"] Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.312255 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7stf\" (UniqueName: \"kubernetes.io/projected/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-kube-api-access-b7stf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.312326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 
17:55:38.312352 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.313567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.415621 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.415721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7stf\" (UniqueName: \"kubernetes.io/projected/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-kube-api-access-b7stf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.415754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.415772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.419542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.419889 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.421172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.438319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7stf\" (UniqueName: \"kubernetes.io/projected/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-kube-api-access-b7stf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.516450 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.650041 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 11 17:55:38 crc kubenswrapper[4837]: I0111 17:55:38.949381 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 11 17:55:39 crc kubenswrapper[4837]: I0111 17:55:39.190377 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw"] Jan 11 17:55:40 crc kubenswrapper[4837]: I0111 17:55:40.140229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" event={"ID":"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51","Type":"ContainerStarted","Data":"9444df6b973b75583c5e5e2d54206ebe138fa701edb4b74d18ff5e5c0ed31375"} Jan 11 17:55:40 crc kubenswrapper[4837]: I0111 17:55:40.140287 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" event={"ID":"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51","Type":"ContainerStarted","Data":"10c1be50375bc51ece18ca35e1f1ca38e31da043eba2f42583fbb54abdeff561"} Jan 11 17:55:40 crc kubenswrapper[4837]: I0111 17:55:40.165285 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" podStartSLOduration=1.491999919 podStartE2EDuration="2.165261991s" podCreationTimestamp="2026-01-11 17:55:38 +0000 UTC" firstStartedPulling="2026-01-11 17:55:39.198958573 +0000 UTC m=+1513.377151279" lastFinishedPulling="2026-01-11 17:55:39.872220645 +0000 UTC m=+1514.050413351" observedRunningTime="2026-01-11 17:55:40.154881882 +0000 UTC m=+1514.333074588" watchObservedRunningTime="2026-01-11 17:55:40.165261991 +0000 UTC m=+1514.343454697" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.599240 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v65l5"] Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.604497 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.637691 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v65l5"] Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.728086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-utilities\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.728169 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-catalog-content\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.728201 
4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jttz\" (UniqueName: \"kubernetes.io/projected/17d0097b-a94a-4a98-8ea3-1c79c144b42b-kube-api-access-4jttz\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.830614 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-utilities\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.830688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-catalog-content\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.830734 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jttz\" (UniqueName: \"kubernetes.io/projected/17d0097b-a94a-4a98-8ea3-1c79c144b42b-kube-api-access-4jttz\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.831500 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-catalog-content\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 
17:56:15.831997 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-utilities\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.857968 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jttz\" (UniqueName: \"kubernetes.io/projected/17d0097b-a94a-4a98-8ea3-1c79c144b42b-kube-api-access-4jttz\") pod \"certified-operators-v65l5\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:15 crc kubenswrapper[4837]: I0111 17:56:15.939818 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:16 crc kubenswrapper[4837]: I0111 17:56:16.529821 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v65l5"] Jan 11 17:56:17 crc kubenswrapper[4837]: I0111 17:56:17.548034 4837 generic.go:334] "Generic (PLEG): container finished" podID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerID="0ce42166e5b006bacf9ac9489f5acd9d44456074914d62109a5ea2e30ec36cfe" exitCode=0 Jan 11 17:56:17 crc kubenswrapper[4837]: I0111 17:56:17.548097 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65l5" event={"ID":"17d0097b-a94a-4a98-8ea3-1c79c144b42b","Type":"ContainerDied","Data":"0ce42166e5b006bacf9ac9489f5acd9d44456074914d62109a5ea2e30ec36cfe"} Jan 11 17:56:17 crc kubenswrapper[4837]: I0111 17:56:17.548370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65l5" event={"ID":"17d0097b-a94a-4a98-8ea3-1c79c144b42b","Type":"ContainerStarted","Data":"3d918ac6978b3590e35d189cbc247d6fc36b8929361937b3f9e9764e87976211"} 
Jan 11 17:56:19 crc kubenswrapper[4837]: I0111 17:56:19.573859 4837 generic.go:334] "Generic (PLEG): container finished" podID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerID="e6536992081c7b85a03cdc73ebd14d912f118e2477016442dff6e307aee14145" exitCode=0 Jan 11 17:56:19 crc kubenswrapper[4837]: I0111 17:56:19.573917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65l5" event={"ID":"17d0097b-a94a-4a98-8ea3-1c79c144b42b","Type":"ContainerDied","Data":"e6536992081c7b85a03cdc73ebd14d912f118e2477016442dff6e307aee14145"} Jan 11 17:56:22 crc kubenswrapper[4837]: I0111 17:56:22.616887 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65l5" event={"ID":"17d0097b-a94a-4a98-8ea3-1c79c144b42b","Type":"ContainerStarted","Data":"386959a2766370441912269a1948f998c50115473f331eea883846845cf75913"} Jan 11 17:56:22 crc kubenswrapper[4837]: I0111 17:56:22.647930 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v65l5" podStartSLOduration=3.866336452 podStartE2EDuration="7.647910567s" podCreationTimestamp="2026-01-11 17:56:15 +0000 UTC" firstStartedPulling="2026-01-11 17:56:17.552519466 +0000 UTC m=+1551.730712172" lastFinishedPulling="2026-01-11 17:56:21.334093571 +0000 UTC m=+1555.512286287" observedRunningTime="2026-01-11 17:56:22.643653633 +0000 UTC m=+1556.821846349" watchObservedRunningTime="2026-01-11 17:56:22.647910567 +0000 UTC m=+1556.826103273" Jan 11 17:56:25 crc kubenswrapper[4837]: I0111 17:56:25.940858 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:25 crc kubenswrapper[4837]: I0111 17:56:25.941317 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:25 crc kubenswrapper[4837]: I0111 17:56:25.992039 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:26 crc kubenswrapper[4837]: I0111 17:56:26.727983 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:26 crc kubenswrapper[4837]: I0111 17:56:26.802713 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v65l5"] Jan 11 17:56:28 crc kubenswrapper[4837]: I0111 17:56:28.670760 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v65l5" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="registry-server" containerID="cri-o://386959a2766370441912269a1948f998c50115473f331eea883846845cf75913" gracePeriod=2 Jan 11 17:56:30 crc kubenswrapper[4837]: I0111 17:56:30.694815 4837 generic.go:334] "Generic (PLEG): container finished" podID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerID="386959a2766370441912269a1948f998c50115473f331eea883846845cf75913" exitCode=0 Jan 11 17:56:30 crc kubenswrapper[4837]: I0111 17:56:30.695045 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65l5" event={"ID":"17d0097b-a94a-4a98-8ea3-1c79c144b42b","Type":"ContainerDied","Data":"386959a2766370441912269a1948f998c50115473f331eea883846845cf75913"} Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.105954 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.169969 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-utilities\") pod \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.170102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-catalog-content\") pod \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.170277 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jttz\" (UniqueName: \"kubernetes.io/projected/17d0097b-a94a-4a98-8ea3-1c79c144b42b-kube-api-access-4jttz\") pod \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\" (UID: \"17d0097b-a94a-4a98-8ea3-1c79c144b42b\") " Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.170932 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-utilities" (OuterVolumeSpecName: "utilities") pod "17d0097b-a94a-4a98-8ea3-1c79c144b42b" (UID: "17d0097b-a94a-4a98-8ea3-1c79c144b42b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.178209 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17d0097b-a94a-4a98-8ea3-1c79c144b42b-kube-api-access-4jttz" (OuterVolumeSpecName: "kube-api-access-4jttz") pod "17d0097b-a94a-4a98-8ea3-1c79c144b42b" (UID: "17d0097b-a94a-4a98-8ea3-1c79c144b42b"). InnerVolumeSpecName "kube-api-access-4jttz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.210167 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17d0097b-a94a-4a98-8ea3-1c79c144b42b" (UID: "17d0097b-a94a-4a98-8ea3-1c79c144b42b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.271870 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.271901 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d0097b-a94a-4a98-8ea3-1c79c144b42b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.271913 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jttz\" (UniqueName: \"kubernetes.io/projected/17d0097b-a94a-4a98-8ea3-1c79c144b42b-kube-api-access-4jttz\") on node \"crc\" DevicePath \"\"" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.709785 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65l5" event={"ID":"17d0097b-a94a-4a98-8ea3-1c79c144b42b","Type":"ContainerDied","Data":"3d918ac6978b3590e35d189cbc247d6fc36b8929361937b3f9e9764e87976211"} Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.709868 4837 scope.go:117] "RemoveContainer" containerID="386959a2766370441912269a1948f998c50115473f331eea883846845cf75913" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.709890 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v65l5" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.741784 4837 scope.go:117] "RemoveContainer" containerID="e6536992081c7b85a03cdc73ebd14d912f118e2477016442dff6e307aee14145" Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.770783 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v65l5"] Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.777603 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v65l5"] Jan 11 17:56:31 crc kubenswrapper[4837]: I0111 17:56:31.795539 4837 scope.go:117] "RemoveContainer" containerID="0ce42166e5b006bacf9ac9489f5acd9d44456074914d62109a5ea2e30ec36cfe" Jan 11 17:56:32 crc kubenswrapper[4837]: I0111 17:56:32.375275 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" path="/var/lib/kubelet/pods/17d0097b-a94a-4a98-8ea3-1c79c144b42b/volumes" Jan 11 17:56:34 crc kubenswrapper[4837]: I0111 17:56:34.733901 4837 scope.go:117] "RemoveContainer" containerID="1289c566c9651fd8ce1f4b7af45c7b5db7bc43fe1f2044142a2eea0e3c3c33cb" Jan 11 17:56:34 crc kubenswrapper[4837]: I0111 17:56:34.788788 4837 scope.go:117] "RemoveContainer" containerID="f5e5a6d7b3e8107fad32fcdd4674d216875cbcfa938175bd760e22e94ee75e3e" Jan 11 17:56:34 crc kubenswrapper[4837]: I0111 17:56:34.850209 4837 scope.go:117] "RemoveContainer" containerID="221ed1d81791c870665aa671ba57922eeda80d386e82af5182a48ba03b593cf1" Jan 11 17:56:34 crc kubenswrapper[4837]: I0111 17:56:34.884272 4837 scope.go:117] "RemoveContainer" containerID="53fb0ce9a8cb569af17e8a40682754a192e1d5c1f46b4b28c0b252a77c3822f0" Jan 11 17:56:39 crc kubenswrapper[4837]: I0111 17:56:39.443775 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 17:56:39 crc kubenswrapper[4837]: I0111 17:56:39.444552 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.521958 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-58smc"] Jan 11 17:56:44 crc kubenswrapper[4837]: E0111 17:56:44.523108 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="extract-content" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.523126 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="extract-content" Jan 11 17:56:44 crc kubenswrapper[4837]: E0111 17:56:44.523149 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="extract-utilities" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.523158 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="extract-utilities" Jan 11 17:56:44 crc kubenswrapper[4837]: E0111 17:56:44.523195 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="registry-server" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.523207 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="registry-server" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.523517 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="17d0097b-a94a-4a98-8ea3-1c79c144b42b" containerName="registry-server" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.526254 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.530258 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58smc"] Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.679092 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdfgh\" (UniqueName: \"kubernetes.io/projected/cfb98456-deac-4bcd-8b35-5fcb3b88b848-kube-api-access-qdfgh\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.679160 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-catalog-content\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.679191 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-utilities\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.780601 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdfgh\" (UniqueName: \"kubernetes.io/projected/cfb98456-deac-4bcd-8b35-5fcb3b88b848-kube-api-access-qdfgh\") pod \"community-operators-58smc\" 
(UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.780924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-catalog-content\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.781003 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-utilities\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.781406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-catalog-content\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.781448 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-utilities\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.799556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdfgh\" (UniqueName: \"kubernetes.io/projected/cfb98456-deac-4bcd-8b35-5fcb3b88b848-kube-api-access-qdfgh\") pod \"community-operators-58smc\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " 
pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:44 crc kubenswrapper[4837]: I0111 17:56:44.864576 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:45 crc kubenswrapper[4837]: I0111 17:56:45.478515 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58smc"] Jan 11 17:56:45 crc kubenswrapper[4837]: I0111 17:56:45.883190 4837 generic.go:334] "Generic (PLEG): container finished" podID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerID="c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a" exitCode=0 Jan 11 17:56:45 crc kubenswrapper[4837]: I0111 17:56:45.883496 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58smc" event={"ID":"cfb98456-deac-4bcd-8b35-5fcb3b88b848","Type":"ContainerDied","Data":"c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a"} Jan 11 17:56:45 crc kubenswrapper[4837]: I0111 17:56:45.884806 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58smc" event={"ID":"cfb98456-deac-4bcd-8b35-5fcb3b88b848","Type":"ContainerStarted","Data":"c3f70cc170d6f93d69091bf6b4a7b31cb2125d073f1a53ae77485d4fec25d1cf"} Jan 11 17:56:47 crc kubenswrapper[4837]: I0111 17:56:47.914413 4837 generic.go:334] "Generic (PLEG): container finished" podID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerID="b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84" exitCode=0 Jan 11 17:56:47 crc kubenswrapper[4837]: I0111 17:56:47.914984 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58smc" event={"ID":"cfb98456-deac-4bcd-8b35-5fcb3b88b848","Type":"ContainerDied","Data":"b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84"} Jan 11 17:56:48 crc kubenswrapper[4837]: I0111 17:56:48.928439 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-58smc" event={"ID":"cfb98456-deac-4bcd-8b35-5fcb3b88b848","Type":"ContainerStarted","Data":"e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59"} Jan 11 17:56:48 crc kubenswrapper[4837]: I0111 17:56:48.956228 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-58smc" podStartSLOduration=2.494226875 podStartE2EDuration="4.956206689s" podCreationTimestamp="2026-01-11 17:56:44 +0000 UTC" firstStartedPulling="2026-01-11 17:56:45.885627387 +0000 UTC m=+1580.063820133" lastFinishedPulling="2026-01-11 17:56:48.347607211 +0000 UTC m=+1582.525799947" observedRunningTime="2026-01-11 17:56:48.950101784 +0000 UTC m=+1583.128294540" watchObservedRunningTime="2026-01-11 17:56:48.956206689 +0000 UTC m=+1583.134399405" Jan 11 17:56:54 crc kubenswrapper[4837]: I0111 17:56:54.865535 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:54 crc kubenswrapper[4837]: I0111 17:56:54.866285 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:54 crc kubenswrapper[4837]: I0111 17:56:54.925424 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:55 crc kubenswrapper[4837]: I0111 17:56:55.054271 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:55 crc kubenswrapper[4837]: I0111 17:56:55.165535 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58smc"] Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.023025 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-58smc" 
podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="registry-server" containerID="cri-o://e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59" gracePeriod=2 Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.516606 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58smc" Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.566435 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-utilities\") pod \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.566481 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-catalog-content\") pod \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.566615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdfgh\" (UniqueName: \"kubernetes.io/projected/cfb98456-deac-4bcd-8b35-5fcb3b88b848-kube-api-access-qdfgh\") pod \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\" (UID: \"cfb98456-deac-4bcd-8b35-5fcb3b88b848\") " Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.569928 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-utilities" (OuterVolumeSpecName: "utilities") pod "cfb98456-deac-4bcd-8b35-5fcb3b88b848" (UID: "cfb98456-deac-4bcd-8b35-5fcb3b88b848"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.575525 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb98456-deac-4bcd-8b35-5fcb3b88b848-kube-api-access-qdfgh" (OuterVolumeSpecName: "kube-api-access-qdfgh") pod "cfb98456-deac-4bcd-8b35-5fcb3b88b848" (UID: "cfb98456-deac-4bcd-8b35-5fcb3b88b848"). InnerVolumeSpecName "kube-api-access-qdfgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.621721 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfb98456-deac-4bcd-8b35-5fcb3b88b848" (UID: "cfb98456-deac-4bcd-8b35-5fcb3b88b848"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.668575 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdfgh\" (UniqueName: \"kubernetes.io/projected/cfb98456-deac-4bcd-8b35-5fcb3b88b848-kube-api-access-qdfgh\") on node \"crc\" DevicePath \"\"" Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.668614 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 17:56:57 crc kubenswrapper[4837]: I0111 17:56:57.668625 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb98456-deac-4bcd-8b35-5fcb3b88b848-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.042213 4837 generic.go:334] "Generic (PLEG): container finished" podID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" 
containerID="e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59" exitCode=0
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.042268 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58smc" event={"ID":"cfb98456-deac-4bcd-8b35-5fcb3b88b848","Type":"ContainerDied","Data":"e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59"}
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.042304 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58smc" event={"ID":"cfb98456-deac-4bcd-8b35-5fcb3b88b848","Type":"ContainerDied","Data":"c3f70cc170d6f93d69091bf6b4a7b31cb2125d073f1a53ae77485d4fec25d1cf"}
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.042327 4837 scope.go:117] "RemoveContainer" containerID="e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.042334 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58smc"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.081074 4837 scope.go:117] "RemoveContainer" containerID="b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.125899 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58smc"]
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.129572 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-58smc"]
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.158012 4837 scope.go:117] "RemoveContainer" containerID="c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.179668 4837 scope.go:117] "RemoveContainer" containerID="e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59"
Jan 11 17:56:58 crc kubenswrapper[4837]: E0111 17:56:58.180434 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59\": container with ID starting with e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59 not found: ID does not exist" containerID="e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.180515 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59"} err="failed to get container status \"e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59\": rpc error: code = NotFound desc = could not find container \"e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59\": container with ID starting with e63ac5e74f25a0eb7699d12edce031adb716f80109fff3d46a556d40c3d39e59 not found: ID does not exist"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.180559 4837 scope.go:117] "RemoveContainer" containerID="b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84"
Jan 11 17:56:58 crc kubenswrapper[4837]: E0111 17:56:58.181591 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84\": container with ID starting with b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84 not found: ID does not exist" containerID="b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.181638 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84"} err="failed to get container status \"b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84\": rpc error: code = NotFound desc = could not find container \"b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84\": container with ID starting with b72b1d88c048f96faca6f0d93c5fdce269bb4efca3ac2dd4420de3687a30ea84 not found: ID does not exist"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.181668 4837 scope.go:117] "RemoveContainer" containerID="c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a"
Jan 11 17:56:58 crc kubenswrapper[4837]: E0111 17:56:58.182223 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a\": container with ID starting with c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a not found: ID does not exist" containerID="c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.182271 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a"} err="failed to get container status \"c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a\": rpc error: code = NotFound desc = could not find container \"c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a\": container with ID starting with c089ad258c5fba9b09e46b58e60587c51a2659a302328f4f5672447c4dcede1a not found: ID does not exist"
Jan 11 17:56:58 crc kubenswrapper[4837]: I0111 17:56:58.376360 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" path="/var/lib/kubelet/pods/cfb98456-deac-4bcd-8b35-5fcb3b88b848/volumes"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.588786 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hcrsk"]
Jan 11 17:57:00 crc kubenswrapper[4837]: E0111 17:57:00.592548 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="registry-server"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.595931 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="registry-server"
Jan 11 17:57:00 crc kubenswrapper[4837]: E0111 17:57:00.596207 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="extract-content"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.596355 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="extract-content"
Jan 11 17:57:00 crc kubenswrapper[4837]: E0111 17:57:00.596599 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="extract-utilities"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.596809 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="extract-utilities"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.597563 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb98456-deac-4bcd-8b35-5fcb3b88b848" containerName="registry-server"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.606140 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcrsk"]
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.606321 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.734090 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-utilities\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.734859 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2p59\" (UniqueName: \"kubernetes.io/projected/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-kube-api-access-k2p59\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.734897 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-catalog-content\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.837152 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-utilities\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.837301 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2p59\" (UniqueName: \"kubernetes.io/projected/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-kube-api-access-k2p59\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.837333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-catalog-content\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.837789 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-utilities\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.837870 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-catalog-content\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.857703 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2p59\" (UniqueName: \"kubernetes.io/projected/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-kube-api-access-k2p59\") pod \"redhat-marketplace-hcrsk\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") " pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:00 crc kubenswrapper[4837]: I0111 17:57:00.933957 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:01 crc kubenswrapper[4837]: I0111 17:57:01.416157 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcrsk"]
Jan 11 17:57:01 crc kubenswrapper[4837]: W0111 17:57:01.428964 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a03b007_f7ad_473a_b43b_1acfe4ec7fce.slice/crio-e585791312b2e90dfe846b7efd0bcf9ed3fd8b95fc5b866fd272a9af35db046e WatchSource:0}: Error finding container e585791312b2e90dfe846b7efd0bcf9ed3fd8b95fc5b866fd272a9af35db046e: Status 404 returned error can't find the container with id e585791312b2e90dfe846b7efd0bcf9ed3fd8b95fc5b866fd272a9af35db046e
Jan 11 17:57:02 crc kubenswrapper[4837]: I0111 17:57:02.093236 4837 generic.go:334] "Generic (PLEG): container finished" podID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerID="8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d" exitCode=0
Jan 11 17:57:02 crc kubenswrapper[4837]: I0111 17:57:02.093361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcrsk" event={"ID":"6a03b007-f7ad-473a-b43b-1acfe4ec7fce","Type":"ContainerDied","Data":"8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d"}
Jan 11 17:57:02 crc kubenswrapper[4837]: I0111 17:57:02.093644 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcrsk" event={"ID":"6a03b007-f7ad-473a-b43b-1acfe4ec7fce","Type":"ContainerStarted","Data":"e585791312b2e90dfe846b7efd0bcf9ed3fd8b95fc5b866fd272a9af35db046e"}
Jan 11 17:57:04 crc kubenswrapper[4837]: I0111 17:57:04.115963 4837 generic.go:334] "Generic (PLEG): container finished" podID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerID="e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7" exitCode=0
Jan 11 17:57:04 crc kubenswrapper[4837]: I0111 17:57:04.116020 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcrsk" event={"ID":"6a03b007-f7ad-473a-b43b-1acfe4ec7fce","Type":"ContainerDied","Data":"e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7"}
Jan 11 17:57:05 crc kubenswrapper[4837]: I0111 17:57:05.128456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcrsk" event={"ID":"6a03b007-f7ad-473a-b43b-1acfe4ec7fce","Type":"ContainerStarted","Data":"4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b"}
Jan 11 17:57:05 crc kubenswrapper[4837]: I0111 17:57:05.159946 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hcrsk" podStartSLOduration=2.599836704 podStartE2EDuration="5.159922122s" podCreationTimestamp="2026-01-11 17:57:00 +0000 UTC" firstStartedPulling="2026-01-11 17:57:02.095955828 +0000 UTC m=+1596.274148574" lastFinishedPulling="2026-01-11 17:57:04.656041266 +0000 UTC m=+1598.834233992" observedRunningTime="2026-01-11 17:57:05.148762042 +0000 UTC m=+1599.326954758" watchObservedRunningTime="2026-01-11 17:57:05.159922122 +0000 UTC m=+1599.338114838"
Jan 11 17:57:09 crc kubenswrapper[4837]: I0111 17:57:09.444067 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 11 17:57:09 crc kubenswrapper[4837]: I0111 17:57:09.444601 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 11 17:57:10 crc kubenswrapper[4837]: I0111 17:57:10.935865 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:10 crc kubenswrapper[4837]: I0111 17:57:10.937625 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:11 crc kubenswrapper[4837]: I0111 17:57:11.025873 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:11 crc kubenswrapper[4837]: I0111 17:57:11.256901 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:11 crc kubenswrapper[4837]: I0111 17:57:11.322407 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcrsk"]
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.226067 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hcrsk" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="registry-server" containerID="cri-o://4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b" gracePeriod=2
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.786236 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.941300 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-utilities\") pod \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") "
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.941517 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-catalog-content\") pod \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") "
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.941605 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2p59\" (UniqueName: \"kubernetes.io/projected/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-kube-api-access-k2p59\") pod \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\" (UID: \"6a03b007-f7ad-473a-b43b-1acfe4ec7fce\") "
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.942180 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-utilities" (OuterVolumeSpecName: "utilities") pod "6a03b007-f7ad-473a-b43b-1acfe4ec7fce" (UID: "6a03b007-f7ad-473a-b43b-1acfe4ec7fce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.942496 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-utilities\") on node \"crc\" DevicePath \"\""
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.957121 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-kube-api-access-k2p59" (OuterVolumeSpecName: "kube-api-access-k2p59") pod "6a03b007-f7ad-473a-b43b-1acfe4ec7fce" (UID: "6a03b007-f7ad-473a-b43b-1acfe4ec7fce"). InnerVolumeSpecName "kube-api-access-k2p59". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:57:13 crc kubenswrapper[4837]: I0111 17:57:13.971111 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a03b007-f7ad-473a-b43b-1acfe4ec7fce" (UID: "6a03b007-f7ad-473a-b43b-1acfe4ec7fce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.044766 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.044814 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2p59\" (UniqueName: \"kubernetes.io/projected/6a03b007-f7ad-473a-b43b-1acfe4ec7fce-kube-api-access-k2p59\") on node \"crc\" DevicePath \"\""
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.245329 4837 generic.go:334] "Generic (PLEG): container finished" podID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerID="4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b" exitCode=0
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.245374 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcrsk" event={"ID":"6a03b007-f7ad-473a-b43b-1acfe4ec7fce","Type":"ContainerDied","Data":"4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b"}
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.245422 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcrsk"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.245516 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcrsk" event={"ID":"6a03b007-f7ad-473a-b43b-1acfe4ec7fce","Type":"ContainerDied","Data":"e585791312b2e90dfe846b7efd0bcf9ed3fd8b95fc5b866fd272a9af35db046e"}
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.245602 4837 scope.go:117] "RemoveContainer" containerID="4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.279664 4837 scope.go:117] "RemoveContainer" containerID="e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.306708 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcrsk"]
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.331309 4837 scope.go:117] "RemoveContainer" containerID="8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.336594 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcrsk"]
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.367301 4837 scope.go:117] "RemoveContainer" containerID="4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b"
Jan 11 17:57:14 crc kubenswrapper[4837]: E0111 17:57:14.367857 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b\": container with ID starting with 4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b not found: ID does not exist" containerID="4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.367897 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b"} err="failed to get container status \"4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b\": rpc error: code = NotFound desc = could not find container \"4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b\": container with ID starting with 4961d2ed7bfdb93004316cc57c89c133c6699384b034ce2127177418b940163b not found: ID does not exist"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.367924 4837 scope.go:117] "RemoveContainer" containerID="e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7"
Jan 11 17:57:14 crc kubenswrapper[4837]: E0111 17:57:14.368305 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7\": container with ID starting with e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7 not found: ID does not exist" containerID="e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.368338 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7"} err="failed to get container status \"e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7\": rpc error: code = NotFound desc = could not find container \"e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7\": container with ID starting with e99a37888f186532b333af29ac25cccfe46f0e759232e50dbae877667f0d28c7 not found: ID does not exist"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.368372 4837 scope.go:117] "RemoveContainer" containerID="8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d"
Jan 11 17:57:14 crc kubenswrapper[4837]: E0111 17:57:14.368629 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d\": container with ID starting with 8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d not found: ID does not exist" containerID="8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.368665 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d"} err="failed to get container status \"8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d\": rpc error: code = NotFound desc = could not find container \"8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d\": container with ID starting with 8ffe739b85d0fb6403569e9722459768dbff2e5f20445647babee98a5d15d51d not found: ID does not exist"
Jan 11 17:57:14 crc kubenswrapper[4837]: I0111 17:57:14.376659 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" path="/var/lib/kubelet/pods/6a03b007-f7ad-473a-b43b-1acfe4ec7fce/volumes"
Jan 11 17:57:35 crc kubenswrapper[4837]: I0111 17:57:35.018872 4837 scope.go:117] "RemoveContainer" containerID="b35778a0da294966f950416862c1a2d9bc895a95c830ed367d3ba78319fdda82"
Jan 11 17:57:35 crc kubenswrapper[4837]: I0111 17:57:35.051202 4837 scope.go:117] "RemoveContainer" containerID="71ae436473ecaeccb052dd25701507bd1adfa22c1b2c4afb6588a78d857f1798"
Jan 11 17:57:39 crc kubenswrapper[4837]: I0111 17:57:39.443733 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 11 17:57:39 crc kubenswrapper[4837]: I0111 17:57:39.444235 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 11 17:57:39 crc kubenswrapper[4837]: I0111 17:57:39.444282 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst"
Jan 11 17:57:39 crc kubenswrapper[4837]: I0111 17:57:39.444929 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 11 17:57:39 crc kubenswrapper[4837]: I0111 17:57:39.444981 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" gracePeriod=600
Jan 11 17:57:39 crc kubenswrapper[4837]: E0111 17:57:39.772976 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:57:40 crc kubenswrapper[4837]: I0111 17:57:40.542186 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" exitCode=0
Jan 11 17:57:40 crc kubenswrapper[4837]: I0111 17:57:40.542259 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"}
Jan 11 17:57:40 crc kubenswrapper[4837]: I0111 17:57:40.542547 4837 scope.go:117] "RemoveContainer" containerID="e9cdcfc59acae9ad70b4c250fd104a15cd4c9020810ce5e6aa418583f1bc934e"
Jan 11 17:57:40 crc kubenswrapper[4837]: I0111 17:57:40.543235 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:57:40 crc kubenswrapper[4837]: E0111 17:57:40.543615 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:57:51 crc kubenswrapper[4837]: I0111 17:57:51.364191 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:57:51 crc kubenswrapper[4837]: E0111 17:57:51.364987 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:58:03 crc kubenswrapper[4837]: I0111 17:58:03.366693 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:58:03 crc kubenswrapper[4837]: E0111 17:58:03.367912 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:58:18 crc kubenswrapper[4837]: I0111 17:58:18.363520 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:58:18 crc kubenswrapper[4837]: E0111 17:58:18.364345 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:58:31 crc kubenswrapper[4837]: I0111 17:58:31.364652 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:58:31 crc kubenswrapper[4837]: E0111 17:58:31.365594 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:58:46 crc kubenswrapper[4837]: I0111 17:58:46.378947 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:58:46 crc kubenswrapper[4837]: E0111 17:58:46.380115 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:58:57 crc kubenswrapper[4837]: I0111 17:58:57.364524 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:58:57 crc kubenswrapper[4837]: E0111 17:58:57.365497 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:59:09 crc kubenswrapper[4837]: I0111 17:59:09.364906 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9"
Jan 11 17:59:09 crc kubenswrapper[4837]: E0111 17:59:09.365746 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 17:59:11 crc kubenswrapper[4837]: I0111 17:59:11.271932 4837 generic.go:334] "Generic (PLEG): container finished" podID="bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" containerID="9444df6b973b75583c5e5e2d54206ebe138fa701edb4b74d18ff5e5c0ed31375" exitCode=0
Jan 11 17:59:11 crc kubenswrapper[4837]: I0111 17:59:11.272058 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" event={"ID":"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51","Type":"ContainerDied","Data":"9444df6b973b75583c5e5e2d54206ebe138fa701edb4b74d18ff5e5c0ed31375"}
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.741018 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw"
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.829201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-inventory\") pod \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") "
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.829333 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7stf\" (UniqueName: \"kubernetes.io/projected/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-kube-api-access-b7stf\") pod \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") "
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.829403 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-bootstrap-combined-ca-bundle\") pod \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") "
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.829451 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-ssh-key-openstack-edpm-ipam\") pod \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\" (UID: \"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51\") "
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.836234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" (UID: "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.838958 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-kube-api-access-b7stf" (OuterVolumeSpecName: "kube-api-access-b7stf") pod "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" (UID: "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51"). InnerVolumeSpecName "kube-api-access-b7stf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.865941 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" (UID: "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.879138 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-inventory" (OuterVolumeSpecName: "inventory") pod "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" (UID: "bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.931823 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7stf\" (UniqueName: \"kubernetes.io/projected/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-kube-api-access-b7stf\") on node \"crc\" DevicePath \"\""
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.931863 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.931877 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 11 17:59:12 crc kubenswrapper[4837]: I0111 17:59:12.931892 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51-inventory\") on node \"crc\" DevicePath \"\""
Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.291648 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" event={"ID":"bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51","Type":"ContainerDied","Data":"10c1be50375bc51ece18ca35e1f1ca38e31da043eba2f42583fbb54abdeff561"}
Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.292157 4837
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10c1be50375bc51ece18ca35e1f1ca38e31da043eba2f42583fbb54abdeff561" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.291717 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.450979 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm"] Jan 11 17:59:13 crc kubenswrapper[4837]: E0111 17:59:13.451407 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="extract-utilities" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.451439 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="extract-utilities" Jan 11 17:59:13 crc kubenswrapper[4837]: E0111 17:59:13.451458 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="extract-content" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.451468 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="extract-content" Jan 11 17:59:13 crc kubenswrapper[4837]: E0111 17:59:13.451489 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="registry-server" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.451497 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="registry-server" Jan 11 17:59:13 crc kubenswrapper[4837]: E0111 17:59:13.451516 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 11 17:59:13 
crc kubenswrapper[4837]: I0111 17:59:13.451525 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.451855 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a03b007-f7ad-473a-b43b-1acfe4ec7fce" containerName="registry-server" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.451886 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.452650 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.455480 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.457388 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.457545 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.459302 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.470320 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm"] Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.544298 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.544602 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfcfj\" (UniqueName: \"kubernetes.io/projected/e6773d83-814c-42bc-8578-5746bb984988-kube-api-access-hfcfj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.544887 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.647400 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.647601 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfcfj\" (UniqueName: \"kubernetes.io/projected/e6773d83-814c-42bc-8578-5746bb984988-kube-api-access-hfcfj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: 
\"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.647761 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.653371 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.659373 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.668940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfcfj\" (UniqueName: \"kubernetes.io/projected/e6773d83-814c-42bc-8578-5746bb984988-kube-api-access-hfcfj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:13 crc kubenswrapper[4837]: I0111 17:59:13.766824 4837 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 17:59:14 crc kubenswrapper[4837]: I0111 17:59:14.322081 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm"] Jan 11 17:59:14 crc kubenswrapper[4837]: W0111 17:59:14.325932 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6773d83_814c_42bc_8578_5746bb984988.slice/crio-856e3e260e56f9fa1ac11189c41b3285bdf84e991da2c43c07dc8c8383c29cd6 WatchSource:0}: Error finding container 856e3e260e56f9fa1ac11189c41b3285bdf84e991da2c43c07dc8c8383c29cd6: Status 404 returned error can't find the container with id 856e3e260e56f9fa1ac11189c41b3285bdf84e991da2c43c07dc8c8383c29cd6 Jan 11 17:59:14 crc kubenswrapper[4837]: I0111 17:59:14.329340 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 17:59:15 crc kubenswrapper[4837]: I0111 17:59:15.315039 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" event={"ID":"e6773d83-814c-42bc-8578-5746bb984988","Type":"ContainerStarted","Data":"856e3e260e56f9fa1ac11189c41b3285bdf84e991da2c43c07dc8c8383c29cd6"} Jan 11 17:59:16 crc kubenswrapper[4837]: I0111 17:59:16.327784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" event={"ID":"e6773d83-814c-42bc-8578-5746bb984988","Type":"ContainerStarted","Data":"eb5676a8c0e1efd16bcd461e5c0711f312eef67e273304eb40e741a4a263754b"} Jan 11 17:59:16 crc kubenswrapper[4837]: I0111 17:59:16.355787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" podStartSLOduration=2.55869545 podStartE2EDuration="3.355761991s" 
podCreationTimestamp="2026-01-11 17:59:13 +0000 UTC" firstStartedPulling="2026-01-11 17:59:14.329127591 +0000 UTC m=+1728.507320297" lastFinishedPulling="2026-01-11 17:59:15.126194122 +0000 UTC m=+1729.304386838" observedRunningTime="2026-01-11 17:59:16.350651084 +0000 UTC m=+1730.528843790" watchObservedRunningTime="2026-01-11 17:59:16.355761991 +0000 UTC m=+1730.533954697" Jan 11 17:59:21 crc kubenswrapper[4837]: I0111 17:59:21.365609 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 17:59:21 crc kubenswrapper[4837]: E0111 17:59:21.366271 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 17:59:27 crc kubenswrapper[4837]: I0111 17:59:27.040337 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-m8sjp"] Jan 11 17:59:27 crc kubenswrapper[4837]: I0111 17:59:27.048325 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-m8sjp"] Jan 11 17:59:28 crc kubenswrapper[4837]: I0111 17:59:28.374652 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ae93f3-f5ad-4225-86a9-c7bf0748d84f" path="/var/lib/kubelet/pods/65ae93f3-f5ad-4225-86a9-c7bf0748d84f/volumes" Jan 11 17:59:29 crc kubenswrapper[4837]: I0111 17:59:29.029292 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dv96d"] Jan 11 17:59:29 crc kubenswrapper[4837]: I0111 17:59:29.050759 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dv96d"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 
17:59:30.049596 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ce95-account-create-update-mv8vt"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.064335 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-b9fmt"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.074171 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8247-account-create-update-chtt6"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.084251 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-047d-account-create-update-9296c"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.096271 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ce95-account-create-update-mv8vt"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.109768 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8247-account-create-update-chtt6"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.120364 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-b9fmt"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.130032 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-047d-account-create-update-9296c"] Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.381439 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129b7610-1c04-47c2-ba4d-4c20195c2071" path="/var/lib/kubelet/pods/129b7610-1c04-47c2-ba4d-4c20195c2071/volumes" Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.382881 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a0934ac-db06-4719-9ee0-edbefddcd983" path="/var/lib/kubelet/pods/3a0934ac-db06-4719-9ee0-edbefddcd983/volumes" Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.384442 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="76495fd7-f795-4340-85fa-9f1469bbd1aa" path="/var/lib/kubelet/pods/76495fd7-f795-4340-85fa-9f1469bbd1aa/volumes" Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.385843 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c933d5f-4b3e-43c8-a87d-67b274355687" path="/var/lib/kubelet/pods/7c933d5f-4b3e-43c8-a87d-67b274355687/volumes" Jan 11 17:59:30 crc kubenswrapper[4837]: I0111 17:59:30.387980 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5853482-7ef2-4be7-83ac-5212d4db1696" path="/var/lib/kubelet/pods/a5853482-7ef2-4be7-83ac-5212d4db1696/volumes" Jan 11 17:59:32 crc kubenswrapper[4837]: I0111 17:59:32.364519 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 17:59:32 crc kubenswrapper[4837]: E0111 17:59:32.365117 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.176267 4837 scope.go:117] "RemoveContainer" containerID="274cf8df922185f1b531431260cbe8bc072410bcd7e3df68013113f1b78a39c4" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.201276 4837 scope.go:117] "RemoveContainer" containerID="0a8f3ee29272f42b879ea7bd68b49ee864cdc1330c6740bc020afd8b722065b3" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.263259 4837 scope.go:117] "RemoveContainer" containerID="4946608abe9f38b0ce7bd682101e55184bf0348f285b29d2181c1d450d82e7be" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.300176 4837 scope.go:117] "RemoveContainer" 
containerID="20a389c545715b8834414800356628d63aacbef1a7d9800a452baf08c3f21efe" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.343262 4837 scope.go:117] "RemoveContainer" containerID="bc3597a6896d5097d8928cdc96d9475de068e7fcb43bbbc7e760fe9c2a0d2a35" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.366308 4837 scope.go:117] "RemoveContainer" containerID="387185be2dafebb0af31a3a4622b4f11ac5d86bde2c0f5c750fae35bed33aefb" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.404623 4837 scope.go:117] "RemoveContainer" containerID="ab94431e6e856e8762b5638e35264dba0d0cf5be791233e8088e845bcdf0924c" Jan 11 17:59:35 crc kubenswrapper[4837]: I0111 17:59:35.443013 4837 scope.go:117] "RemoveContainer" containerID="f1a691cbefb207e0912f108bb1859fef3a8ca9a89fb8eee662e8d939b31df464" Jan 11 17:59:44 crc kubenswrapper[4837]: I0111 17:59:44.364276 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 17:59:44 crc kubenswrapper[4837]: E0111 17:59:44.364958 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 17:59:55 crc kubenswrapper[4837]: I0111 17:59:55.364631 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 17:59:55 crc kubenswrapper[4837]: E0111 17:59:55.365371 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.054978 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-khqcj"] Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.065753 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-khqcj"] Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.145438 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5"] Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.146772 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.149159 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.151412 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.168793 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5"] Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.311343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85tl\" (UniqueName: \"kubernetes.io/projected/118fb501-a455-4398-9221-bc3c8922d5ff-kube-api-access-s85tl\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.311438 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118fb501-a455-4398-9221-bc3c8922d5ff-config-volume\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.311918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118fb501-a455-4398-9221-bc3c8922d5ff-secret-volume\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.381898 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d861b390-4351-4ffb-8bb5-19201f06e3db" path="/var/lib/kubelet/pods/d861b390-4351-4ffb-8bb5-19201f06e3db/volumes" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.414006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85tl\" (UniqueName: \"kubernetes.io/projected/118fb501-a455-4398-9221-bc3c8922d5ff-kube-api-access-s85tl\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.414122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118fb501-a455-4398-9221-bc3c8922d5ff-config-volume\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.414382 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118fb501-a455-4398-9221-bc3c8922d5ff-secret-volume\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.415065 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118fb501-a455-4398-9221-bc3c8922d5ff-config-volume\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.434722 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118fb501-a455-4398-9221-bc3c8922d5ff-secret-volume\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.435928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85tl\" (UniqueName: \"kubernetes.io/projected/118fb501-a455-4398-9221-bc3c8922d5ff-kube-api-access-s85tl\") pod \"collect-profiles-29469240-kx6l5\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.519955 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:00 crc kubenswrapper[4837]: I0111 18:00:00.992211 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5"] Jan 11 18:00:01 crc kubenswrapper[4837]: I0111 18:00:01.826486 4837 generic.go:334] "Generic (PLEG): container finished" podID="118fb501-a455-4398-9221-bc3c8922d5ff" containerID="2a893a0965754f760e681e36c14e768a4c22cec4b37f6eef68d16056931d793d" exitCode=0 Jan 11 18:00:01 crc kubenswrapper[4837]: I0111 18:00:01.826568 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" event={"ID":"118fb501-a455-4398-9221-bc3c8922d5ff","Type":"ContainerDied","Data":"2a893a0965754f760e681e36c14e768a4c22cec4b37f6eef68d16056931d793d"} Jan 11 18:00:01 crc kubenswrapper[4837]: I0111 18:00:01.826615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" event={"ID":"118fb501-a455-4398-9221-bc3c8922d5ff","Type":"ContainerStarted","Data":"5d2506e9ba91cc21223849e51656e1b97573dec50d30ad54b14debac5f048216"} Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.257812 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.382899 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118fb501-a455-4398-9221-bc3c8922d5ff-config-volume\") pod \"118fb501-a455-4398-9221-bc3c8922d5ff\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.383160 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118fb501-a455-4398-9221-bc3c8922d5ff-secret-volume\") pod \"118fb501-a455-4398-9221-bc3c8922d5ff\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.383208 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s85tl\" (UniqueName: \"kubernetes.io/projected/118fb501-a455-4398-9221-bc3c8922d5ff-kube-api-access-s85tl\") pod \"118fb501-a455-4398-9221-bc3c8922d5ff\" (UID: \"118fb501-a455-4398-9221-bc3c8922d5ff\") " Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.383820 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/118fb501-a455-4398-9221-bc3c8922d5ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "118fb501-a455-4398-9221-bc3c8922d5ff" (UID: "118fb501-a455-4398-9221-bc3c8922d5ff"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.383949 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118fb501-a455-4398-9221-bc3c8922d5ff-config-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.392769 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118fb501-a455-4398-9221-bc3c8922d5ff-kube-api-access-s85tl" (OuterVolumeSpecName: "kube-api-access-s85tl") pod "118fb501-a455-4398-9221-bc3c8922d5ff" (UID: "118fb501-a455-4398-9221-bc3c8922d5ff"). InnerVolumeSpecName "kube-api-access-s85tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.394813 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/118fb501-a455-4398-9221-bc3c8922d5ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "118fb501-a455-4398-9221-bc3c8922d5ff" (UID: "118fb501-a455-4398-9221-bc3c8922d5ff"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.485653 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118fb501-a455-4398-9221-bc3c8922d5ff-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.485707 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s85tl\" (UniqueName: \"kubernetes.io/projected/118fb501-a455-4398-9221-bc3c8922d5ff-kube-api-access-s85tl\") on node \"crc\" DevicePath \"\"" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.851562 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" event={"ID":"118fb501-a455-4398-9221-bc3c8922d5ff","Type":"ContainerDied","Data":"5d2506e9ba91cc21223849e51656e1b97573dec50d30ad54b14debac5f048216"} Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.851601 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d2506e9ba91cc21223849e51656e1b97573dec50d30ad54b14debac5f048216" Jan 11 18:00:03 crc kubenswrapper[4837]: I0111 18:00:03.851641 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5" Jan 11 18:00:08 crc kubenswrapper[4837]: I0111 18:00:08.364792 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:00:08 crc kubenswrapper[4837]: E0111 18:00:08.365933 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:00:17 crc kubenswrapper[4837]: I0111 18:00:17.055997 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-rvfdr"] Jan 11 18:00:17 crc kubenswrapper[4837]: I0111 18:00:17.068608 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-rvfdr"] Jan 11 18:00:18 crc kubenswrapper[4837]: I0111 18:00:18.378748 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29bd365d-3562-452a-a585-041d3f538ebe" path="/var/lib/kubelet/pods/29bd365d-3562-452a-a585-041d3f538ebe/volumes" Jan 11 18:00:19 crc kubenswrapper[4837]: I0111 18:00:19.364572 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:00:19 crc kubenswrapper[4837]: E0111 18:00:19.365503 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" 
podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.050490 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-136e-account-create-update-chnvk"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.068392 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f46b-account-create-update-ctrv8"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.080183 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9090-account-create-update-674ql"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.090975 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8nczp"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.099156 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5lj8p"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.106517 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f46b-account-create-update-ctrv8"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.113121 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-136e-account-create-update-chnvk"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.120890 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9090-account-create-update-674ql"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.127468 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8nczp"] Jan 11 18:00:21 crc kubenswrapper[4837]: I0111 18:00:21.137872 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5lj8p"] Jan 11 18:00:22 crc kubenswrapper[4837]: I0111 18:00:22.377770 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3865f01a-21c1-4214-b979-14e72a764eb8" path="/var/lib/kubelet/pods/3865f01a-21c1-4214-b979-14e72a764eb8/volumes" Jan 11 
18:00:22 crc kubenswrapper[4837]: I0111 18:00:22.379097 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8fa4d5-171c-4961-b1e2-203c2d2a128f" path="/var/lib/kubelet/pods/7c8fa4d5-171c-4961-b1e2-203c2d2a128f/volumes" Jan 11 18:00:22 crc kubenswrapper[4837]: I0111 18:00:22.380284 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f02a3348-4ccd-4793-8f46-e735fa0fc49d" path="/var/lib/kubelet/pods/f02a3348-4ccd-4793-8f46-e735fa0fc49d/volumes" Jan 11 18:00:22 crc kubenswrapper[4837]: I0111 18:00:22.381614 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d3da28-9c32-4294-b0b9-35b383dafb36" path="/var/lib/kubelet/pods/f1d3da28-9c32-4294-b0b9-35b383dafb36/volumes" Jan 11 18:00:22 crc kubenswrapper[4837]: I0111 18:00:22.383866 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff652e64-59f9-4670-8952-a33ee996c7e5" path="/var/lib/kubelet/pods/ff652e64-59f9-4670-8952-a33ee996c7e5/volumes" Jan 11 18:00:25 crc kubenswrapper[4837]: I0111 18:00:25.037647 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-d8v5z"] Jan 11 18:00:25 crc kubenswrapper[4837]: I0111 18:00:25.057316 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-d8v5z"] Jan 11 18:00:26 crc kubenswrapper[4837]: I0111 18:00:26.386578 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca257e2-1e1f-401c-9a8e-776746d6bfe2" path="/var/lib/kubelet/pods/3ca257e2-1e1f-401c-9a8e-776746d6bfe2/volumes" Jan 11 18:00:28 crc kubenswrapper[4837]: I0111 18:00:28.048365 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-r8vzn"] Jan 11 18:00:28 crc kubenswrapper[4837]: I0111 18:00:28.058262 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-r8vzn"] Jan 11 18:00:28 crc kubenswrapper[4837]: I0111 18:00:28.379548 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="ed1cfeaf-6da5-40ed-b605-077e5c95900c" path="/var/lib/kubelet/pods/ed1cfeaf-6da5-40ed-b605-077e5c95900c/volumes" Jan 11 18:00:33 crc kubenswrapper[4837]: I0111 18:00:33.364160 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:00:33 crc kubenswrapper[4837]: E0111 18:00:33.364929 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.607124 4837 scope.go:117] "RemoveContainer" containerID="93c7e0d022ac13d05d186210059cd3587449e76f24ba046842ec3e34a7e70f73" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.663580 4837 scope.go:117] "RemoveContainer" containerID="1d2a6a0a70eca994b42dc0d4f40e9703d7324ffca4d61e9113e69c400cbb1446" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.698129 4837 scope.go:117] "RemoveContainer" containerID="56b953bf2da6d3a8a84c18f4249c0a5f2c9bf0cdb0724a20a9cff19e12b1f277" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.737718 4837 scope.go:117] "RemoveContainer" containerID="820a117f8c24d68f1bd3a716a9093ff50bfc7224e3c22ac46e6eac2b9cf22ef9" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.765930 4837 scope.go:117] "RemoveContainer" containerID="b7fe0e99090474c599792ca7a532bd6276dd675cebeb22e673233cb41e557686" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.807720 4837 scope.go:117] "RemoveContainer" containerID="a36a59c0876ae98d222aab78869c8bcdf4810ea828fc526059b03a3d70f1d37f" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.868629 4837 scope.go:117] "RemoveContainer" 
containerID="bc7fb78df1712c24841197035ddd115fb2ce6e77521f6ba61641113c90a27917" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.889470 4837 scope.go:117] "RemoveContainer" containerID="e4b31cfde4e25b9262febddecd8a97fddf82fcde426e94897e7395faf084ffc9" Jan 11 18:00:35 crc kubenswrapper[4837]: I0111 18:00:35.908272 4837 scope.go:117] "RemoveContainer" containerID="ee5ecc1a9cfec995b82e04d12deb2fb465641ad952c38d225e5ec7da48803bb5" Jan 11 18:00:46 crc kubenswrapper[4837]: I0111 18:00:46.373802 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:00:46 crc kubenswrapper[4837]: E0111 18:00:46.375897 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:00:58 crc kubenswrapper[4837]: I0111 18:00:58.363860 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:00:58 crc kubenswrapper[4837]: E0111 18:00:58.364704 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:00:58 crc kubenswrapper[4837]: I0111 18:00:58.426256 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6773d83-814c-42bc-8578-5746bb984988" 
containerID="eb5676a8c0e1efd16bcd461e5c0711f312eef67e273304eb40e741a4a263754b" exitCode=0 Jan 11 18:00:58 crc kubenswrapper[4837]: I0111 18:00:58.426311 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" event={"ID":"e6773d83-814c-42bc-8578-5746bb984988","Type":"ContainerDied","Data":"eb5676a8c0e1efd16bcd461e5c0711f312eef67e273304eb40e741a4a263754b"} Jan 11 18:00:59 crc kubenswrapper[4837]: I0111 18:00:59.828035 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 18:00:59 crc kubenswrapper[4837]: I0111 18:00:59.909358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfcfj\" (UniqueName: \"kubernetes.io/projected/e6773d83-814c-42bc-8578-5746bb984988-kube-api-access-hfcfj\") pod \"e6773d83-814c-42bc-8578-5746bb984988\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " Jan 11 18:00:59 crc kubenswrapper[4837]: I0111 18:00:59.910124 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-inventory\") pod \"e6773d83-814c-42bc-8578-5746bb984988\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " Jan 11 18:00:59 crc kubenswrapper[4837]: I0111 18:00:59.910364 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-ssh-key-openstack-edpm-ipam\") pod \"e6773d83-814c-42bc-8578-5746bb984988\" (UID: \"e6773d83-814c-42bc-8578-5746bb984988\") " Jan 11 18:00:59 crc kubenswrapper[4837]: I0111 18:00:59.917420 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6773d83-814c-42bc-8578-5746bb984988-kube-api-access-hfcfj" (OuterVolumeSpecName: 
"kube-api-access-hfcfj") pod "e6773d83-814c-42bc-8578-5746bb984988" (UID: "e6773d83-814c-42bc-8578-5746bb984988"). InnerVolumeSpecName "kube-api-access-hfcfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:00:59 crc kubenswrapper[4837]: I0111 18:00:59.938906 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6773d83-814c-42bc-8578-5746bb984988" (UID: "e6773d83-814c-42bc-8578-5746bb984988"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:00:59 crc kubenswrapper[4837]: I0111 18:00:59.941204 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-inventory" (OuterVolumeSpecName: "inventory") pod "e6773d83-814c-42bc-8578-5746bb984988" (UID: "e6773d83-814c-42bc-8578-5746bb984988"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.012697 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.012908 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfcfj\" (UniqueName: \"kubernetes.io/projected/e6773d83-814c-42bc-8578-5746bb984988-kube-api-access-hfcfj\") on node \"crc\" DevicePath \"\"" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.012965 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6773d83-814c-42bc-8578-5746bb984988-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.155443 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29469241-hc276"] Jan 11 18:01:00 crc kubenswrapper[4837]: E0111 18:01:00.156091 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6773d83-814c-42bc-8578-5746bb984988" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.156108 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6773d83-814c-42bc-8578-5746bb984988" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 11 18:01:00 crc kubenswrapper[4837]: E0111 18:01:00.156146 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118fb501-a455-4398-9221-bc3c8922d5ff" containerName="collect-profiles" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.156152 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="118fb501-a455-4398-9221-bc3c8922d5ff" containerName="collect-profiles" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.156337 4837 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="118fb501-a455-4398-9221-bc3c8922d5ff" containerName="collect-profiles" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.156352 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6773d83-814c-42bc-8578-5746bb984988" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.156970 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.178024 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29469241-hc276"] Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.215924 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/d2c8127b-3998-456f-bd2c-01f945d7f0b9-kube-api-access-p8ntb\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.215984 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-fernet-keys\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.216025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-config-data\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.216063 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-combined-ca-bundle\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.317774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/d2c8127b-3998-456f-bd2c-01f945d7f0b9-kube-api-access-p8ntb\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.317856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-fernet-keys\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.317927 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-config-data\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.317990 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-combined-ca-bundle\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.322976 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-config-data\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.324801 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-fernet-keys\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.336527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/d2c8127b-3998-456f-bd2c-01f945d7f0b9-kube-api-access-p8ntb\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.336760 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-combined-ca-bundle\") pod \"keystone-cron-29469241-hc276\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.445290 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" event={"ID":"e6773d83-814c-42bc-8578-5746bb984988","Type":"ContainerDied","Data":"856e3e260e56f9fa1ac11189c41b3285bdf84e991da2c43c07dc8c8383c29cd6"} Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.445337 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856e3e260e56f9fa1ac11189c41b3285bdf84e991da2c43c07dc8c8383c29cd6" Jan 11 
18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.445369 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.483164 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.532615 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9"] Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.534115 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.538096 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.538596 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.539028 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.539093 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.553505 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9"] Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.626259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwjs5\" (UniqueName: 
\"kubernetes.io/projected/fd04d490-42de-47b9-aa6f-bc09ba8dd539-kube-api-access-dwjs5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.626634 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.626786 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.728470 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.728539 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwjs5\" (UniqueName: \"kubernetes.io/projected/fd04d490-42de-47b9-aa6f-bc09ba8dd539-kube-api-access-dwjs5\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.728563 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.735313 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.735351 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 crc kubenswrapper[4837]: I0111 18:01:00.754793 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwjs5\" (UniqueName: \"kubernetes.io/projected/fd04d490-42de-47b9-aa6f-bc09ba8dd539-kube-api-access-dwjs5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:00 
crc kubenswrapper[4837]: I0111 18:01:00.856498 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:01:01 crc kubenswrapper[4837]: I0111 18:01:01.015839 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29469241-hc276"] Jan 11 18:01:01 crc kubenswrapper[4837]: I0111 18:01:01.217714 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9"] Jan 11 18:01:01 crc kubenswrapper[4837]: W0111 18:01:01.219451 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd04d490_42de_47b9_aa6f_bc09ba8dd539.slice/crio-65bfa12bc30f677d5252298076e1a1c64a5a632d3050f071568575c5b103f6e1 WatchSource:0}: Error finding container 65bfa12bc30f677d5252298076e1a1c64a5a632d3050f071568575c5b103f6e1: Status 404 returned error can't find the container with id 65bfa12bc30f677d5252298076e1a1c64a5a632d3050f071568575c5b103f6e1 Jan 11 18:01:01 crc kubenswrapper[4837]: I0111 18:01:01.455088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" event={"ID":"fd04d490-42de-47b9-aa6f-bc09ba8dd539","Type":"ContainerStarted","Data":"65bfa12bc30f677d5252298076e1a1c64a5a632d3050f071568575c5b103f6e1"} Jan 11 18:01:01 crc kubenswrapper[4837]: I0111 18:01:01.456765 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29469241-hc276" event={"ID":"d2c8127b-3998-456f-bd2c-01f945d7f0b9","Type":"ContainerStarted","Data":"1414e300a117e5ca94cd3fbc7f5c8c2a91cd31159d9cb81b61a159219052d132"} Jan 11 18:01:01 crc kubenswrapper[4837]: I0111 18:01:01.456806 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29469241-hc276" 
event={"ID":"d2c8127b-3998-456f-bd2c-01f945d7f0b9","Type":"ContainerStarted","Data":"c7e437a147196c4500a26fe886f8c67f3fb087aafeb1764580f7cd36a84b4baa"} Jan 11 18:01:01 crc kubenswrapper[4837]: I0111 18:01:01.477826 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29469241-hc276" podStartSLOduration=1.477806562 podStartE2EDuration="1.477806562s" podCreationTimestamp="2026-01-11 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 18:01:01.469128769 +0000 UTC m=+1835.647321525" watchObservedRunningTime="2026-01-11 18:01:01.477806562 +0000 UTC m=+1835.655999288" Jan 11 18:01:02 crc kubenswrapper[4837]: I0111 18:01:02.470411 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" event={"ID":"fd04d490-42de-47b9-aa6f-bc09ba8dd539","Type":"ContainerStarted","Data":"952acf19913da7a957722747aa6d9fa411e5c7fb475243faa939acb9c7921647"} Jan 11 18:01:02 crc kubenswrapper[4837]: I0111 18:01:02.495140 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" podStartSLOduration=2.078277063 podStartE2EDuration="2.495116349s" podCreationTimestamp="2026-01-11 18:01:00 +0000 UTC" firstStartedPulling="2026-01-11 18:01:01.220849913 +0000 UTC m=+1835.399042619" lastFinishedPulling="2026-01-11 18:01:01.637689199 +0000 UTC m=+1835.815881905" observedRunningTime="2026-01-11 18:01:02.489283653 +0000 UTC m=+1836.667476369" watchObservedRunningTime="2026-01-11 18:01:02.495116349 +0000 UTC m=+1836.673309085" Jan 11 18:01:03 crc kubenswrapper[4837]: I0111 18:01:03.483959 4837 generic.go:334] "Generic (PLEG): container finished" podID="d2c8127b-3998-456f-bd2c-01f945d7f0b9" containerID="1414e300a117e5ca94cd3fbc7f5c8c2a91cd31159d9cb81b61a159219052d132" exitCode=0 Jan 11 18:01:03 
crc kubenswrapper[4837]: I0111 18:01:03.484315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29469241-hc276" event={"ID":"d2c8127b-3998-456f-bd2c-01f945d7f0b9","Type":"ContainerDied","Data":"1414e300a117e5ca94cd3fbc7f5c8c2a91cd31159d9cb81b61a159219052d132"} Jan 11 18:01:04 crc kubenswrapper[4837]: I0111 18:01:04.847263 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.016741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-config-data\") pod \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.017322 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-combined-ca-bundle\") pod \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.017406 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/d2c8127b-3998-456f-bd2c-01f945d7f0b9-kube-api-access-p8ntb\") pod \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.017534 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-fernet-keys\") pod \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\" (UID: \"d2c8127b-3998-456f-bd2c-01f945d7f0b9\") " Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.023216 4837 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c8127b-3998-456f-bd2c-01f945d7f0b9-kube-api-access-p8ntb" (OuterVolumeSpecName: "kube-api-access-p8ntb") pod "d2c8127b-3998-456f-bd2c-01f945d7f0b9" (UID: "d2c8127b-3998-456f-bd2c-01f945d7f0b9"). InnerVolumeSpecName "kube-api-access-p8ntb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.031479 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d2c8127b-3998-456f-bd2c-01f945d7f0b9" (UID: "d2c8127b-3998-456f-bd2c-01f945d7f0b9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.055411 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2c8127b-3998-456f-bd2c-01f945d7f0b9" (UID: "d2c8127b-3998-456f-bd2c-01f945d7f0b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.079630 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-config-data" (OuterVolumeSpecName: "config-data") pod "d2c8127b-3998-456f-bd2c-01f945d7f0b9" (UID: "d2c8127b-3998-456f-bd2c-01f945d7f0b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.120959 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.121001 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.121011 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c8127b-3998-456f-bd2c-01f945d7f0b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.121024 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8ntb\" (UniqueName: \"kubernetes.io/projected/d2c8127b-3998-456f-bd2c-01f945d7f0b9-kube-api-access-p8ntb\") on node \"crc\" DevicePath \"\"" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.508555 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29469241-hc276" Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.508618 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29469241-hc276" event={"ID":"d2c8127b-3998-456f-bd2c-01f945d7f0b9","Type":"ContainerDied","Data":"c7e437a147196c4500a26fe886f8c67f3fb087aafeb1764580f7cd36a84b4baa"} Jan 11 18:01:05 crc kubenswrapper[4837]: I0111 18:01:05.509117 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7e437a147196c4500a26fe886f8c67f3fb087aafeb1764580f7cd36a84b4baa" Jan 11 18:01:07 crc kubenswrapper[4837]: I0111 18:01:07.057901 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7pd2m"] Jan 11 18:01:07 crc kubenswrapper[4837]: I0111 18:01:07.070513 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7pd2m"] Jan 11 18:01:08 crc kubenswrapper[4837]: I0111 18:01:08.034939 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-k7p7p"] Jan 11 18:01:08 crc kubenswrapper[4837]: I0111 18:01:08.042547 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-k7p7p"] Jan 11 18:01:08 crc kubenswrapper[4837]: I0111 18:01:08.632773 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5803c46-a48f-4120-9010-51375caff2a5" path="/var/lib/kubelet/pods/c5803c46-a48f-4120-9010-51375caff2a5/volumes" Jan 11 18:01:08 crc kubenswrapper[4837]: I0111 18:01:08.634199 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca96b8a6-63fc-49fc-8898-6794c54e1676" path="/var/lib/kubelet/pods/ca96b8a6-63fc-49fc-8898-6794c54e1676/volumes" Jan 11 18:01:12 crc kubenswrapper[4837]: I0111 18:01:12.059516 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fv6l8"] Jan 11 18:01:12 crc kubenswrapper[4837]: I0111 18:01:12.069754 4837 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-db-sync-fv6l8"] Jan 11 18:01:12 crc kubenswrapper[4837]: I0111 18:01:12.364705 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:01:12 crc kubenswrapper[4837]: E0111 18:01:12.365103 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:01:12 crc kubenswrapper[4837]: I0111 18:01:12.379341 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b22cc91-a2b0-4468-96bc-73cb4aab66bb" path="/var/lib/kubelet/pods/1b22cc91-a2b0-4468-96bc-73cb4aab66bb/volumes" Jan 11 18:01:23 crc kubenswrapper[4837]: I0111 18:01:23.037598 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6mdwl"] Jan 11 18:01:23 crc kubenswrapper[4837]: I0111 18:01:23.047303 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6mdwl"] Jan 11 18:01:24 crc kubenswrapper[4837]: I0111 18:01:24.395077 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5930b460-1c65-4c06-a3bc-f6d6f0518110" path="/var/lib/kubelet/pods/5930b460-1c65-4c06-a3bc-f6d6f0518110/volumes" Jan 11 18:01:25 crc kubenswrapper[4837]: I0111 18:01:25.364755 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:01:25 crc kubenswrapper[4837]: E0111 18:01:25.365226 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:01:29 crc kubenswrapper[4837]: I0111 18:01:29.043578 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-l58f7"] Jan 11 18:01:29 crc kubenswrapper[4837]: I0111 18:01:29.058449 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-l58f7"] Jan 11 18:01:30 crc kubenswrapper[4837]: I0111 18:01:30.377485 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc4b7a3-4585-48ee-9cf3-caa58c9f5895" path="/var/lib/kubelet/pods/ecc4b7a3-4585-48ee-9cf3-caa58c9f5895/volumes" Jan 11 18:01:36 crc kubenswrapper[4837]: I0111 18:01:36.092991 4837 scope.go:117] "RemoveContainer" containerID="48dffaa36d493d8933fc0c6d2c32e0493364340b235a0c14855cc42fb8164cb3" Jan 11 18:01:36 crc kubenswrapper[4837]: I0111 18:01:36.134608 4837 scope.go:117] "RemoveContainer" containerID="ec89d103ce72e03c31f614723ff515f9f50db651282eb16e23134970bbd6d516" Jan 11 18:01:36 crc kubenswrapper[4837]: I0111 18:01:36.195106 4837 scope.go:117] "RemoveContainer" containerID="2bb5637ead90d8db0e9cc88d248e8a187c19bf6d89b936cc130f704c59b2d2eb" Jan 11 18:01:36 crc kubenswrapper[4837]: I0111 18:01:36.252192 4837 scope.go:117] "RemoveContainer" containerID="aac301345f13d97aaec56ecccb3cd8c148a3f0d7fdf0d6970717bd45d50ca86d" Jan 11 18:01:36 crc kubenswrapper[4837]: I0111 18:01:36.289962 4837 scope.go:117] "RemoveContainer" containerID="05e01069189ca9bcb009b7a4e0f5bbac836d0fe3c08bea3479a309d144351b91" Jan 11 18:01:38 crc kubenswrapper[4837]: I0111 18:01:38.364190 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:01:38 crc kubenswrapper[4837]: E0111 18:01:38.365094 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:01:51 crc kubenswrapper[4837]: I0111 18:01:51.364588 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:01:51 crc kubenswrapper[4837]: E0111 18:01:51.365401 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:02:06 crc kubenswrapper[4837]: I0111 18:02:06.387956 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:02:06 crc kubenswrapper[4837]: E0111 18:02:06.390431 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:02:16 crc kubenswrapper[4837]: I0111 18:02:16.069237 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-h9f9z"] Jan 11 18:02:16 crc kubenswrapper[4837]: I0111 18:02:16.083974 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-h9f9z"] Jan 11 18:02:16 crc kubenswrapper[4837]: I0111 18:02:16.381024 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d9b221-84b0-4c01-9870-30500cafdaf5" path="/var/lib/kubelet/pods/54d9b221-84b0-4c01-9870-30500cafdaf5/volumes" Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.038402 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-v96ck"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.044624 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e41d-account-create-update-l96tm"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.052399 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hldbw"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.059567 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f0ad-account-create-update-6nvvv"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.066216 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e41d-account-create-update-l96tm"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.072560 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-v96ck"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.078580 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f0ad-account-create-update-6nvvv"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.084277 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hldbw"] Jan 11 18:02:17 crc kubenswrapper[4837]: I0111 18:02:17.364443 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:02:17 crc kubenswrapper[4837]: E0111 18:02:17.364665 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:02:18 crc kubenswrapper[4837]: I0111 18:02:18.052224 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7633-account-create-update-2l7l4"] Jan 11 18:02:18 crc kubenswrapper[4837]: I0111 18:02:18.067356 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7633-account-create-update-2l7l4"] Jan 11 18:02:18 crc kubenswrapper[4837]: I0111 18:02:18.376190 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85847325-510d-4024-86ce-271201c83e9a" path="/var/lib/kubelet/pods/85847325-510d-4024-86ce-271201c83e9a/volumes" Jan 11 18:02:18 crc kubenswrapper[4837]: I0111 18:02:18.377476 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89212816-4e77-490e-ae62-d3aef36b0570" path="/var/lib/kubelet/pods/89212816-4e77-490e-ae62-d3aef36b0570/volumes" Jan 11 18:02:18 crc kubenswrapper[4837]: I0111 18:02:18.378663 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a509deff-a6eb-435a-8739-bc5b0489d32b" path="/var/lib/kubelet/pods/a509deff-a6eb-435a-8739-bc5b0489d32b/volumes" Jan 11 18:02:18 crc kubenswrapper[4837]: I0111 18:02:18.379840 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f55033-05b1-49a2-8848-17633aeea8ca" path="/var/lib/kubelet/pods/b1f55033-05b1-49a2-8848-17633aeea8ca/volumes" Jan 11 18:02:18 crc kubenswrapper[4837]: I0111 18:02:18.381524 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc121be2-7012-40c1-8dbe-722d0e838685" path="/var/lib/kubelet/pods/dc121be2-7012-40c1-8dbe-722d0e838685/volumes" Jan 11 18:02:19 crc kubenswrapper[4837]: I0111 
18:02:19.424910 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd04d490-42de-47b9-aa6f-bc09ba8dd539" containerID="952acf19913da7a957722747aa6d9fa411e5c7fb475243faa939acb9c7921647" exitCode=0 Jan 11 18:02:19 crc kubenswrapper[4837]: I0111 18:02:19.424975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" event={"ID":"fd04d490-42de-47b9-aa6f-bc09ba8dd539","Type":"ContainerDied","Data":"952acf19913da7a957722747aa6d9fa411e5c7fb475243faa939acb9c7921647"} Jan 11 18:02:20 crc kubenswrapper[4837]: I0111 18:02:20.891559 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:02:20 crc kubenswrapper[4837]: I0111 18:02:20.928549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-inventory\") pod \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " Jan 11 18:02:20 crc kubenswrapper[4837]: I0111 18:02:20.928718 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwjs5\" (UniqueName: \"kubernetes.io/projected/fd04d490-42de-47b9-aa6f-bc09ba8dd539-kube-api-access-dwjs5\") pod \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " Jan 11 18:02:20 crc kubenswrapper[4837]: I0111 18:02:20.928769 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-ssh-key-openstack-edpm-ipam\") pod \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\" (UID: \"fd04d490-42de-47b9-aa6f-bc09ba8dd539\") " Jan 11 18:02:20 crc kubenswrapper[4837]: I0111 18:02:20.934368 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fd04d490-42de-47b9-aa6f-bc09ba8dd539-kube-api-access-dwjs5" (OuterVolumeSpecName: "kube-api-access-dwjs5") pod "fd04d490-42de-47b9-aa6f-bc09ba8dd539" (UID: "fd04d490-42de-47b9-aa6f-bc09ba8dd539"). InnerVolumeSpecName "kube-api-access-dwjs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:02:20 crc kubenswrapper[4837]: I0111 18:02:20.957370 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-inventory" (OuterVolumeSpecName: "inventory") pod "fd04d490-42de-47b9-aa6f-bc09ba8dd539" (UID: "fd04d490-42de-47b9-aa6f-bc09ba8dd539"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:02:20 crc kubenswrapper[4837]: I0111 18:02:20.958454 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fd04d490-42de-47b9-aa6f-bc09ba8dd539" (UID: "fd04d490-42de-47b9-aa6f-bc09ba8dd539"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.030585 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.030617 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwjs5\" (UniqueName: \"kubernetes.io/projected/fd04d490-42de-47b9-aa6f-bc09ba8dd539-kube-api-access-dwjs5\") on node \"crc\" DevicePath \"\"" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.030631 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd04d490-42de-47b9-aa6f-bc09ba8dd539-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.446838 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" event={"ID":"fd04d490-42de-47b9-aa6f-bc09ba8dd539","Type":"ContainerDied","Data":"65bfa12bc30f677d5252298076e1a1c64a5a632d3050f071568575c5b103f6e1"} Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.446892 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.446895 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65bfa12bc30f677d5252298076e1a1c64a5a632d3050f071568575c5b103f6e1" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.555076 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9"] Jan 11 18:02:21 crc kubenswrapper[4837]: E0111 18:02:21.555714 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c8127b-3998-456f-bd2c-01f945d7f0b9" containerName="keystone-cron" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.555744 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c8127b-3998-456f-bd2c-01f945d7f0b9" containerName="keystone-cron" Jan 11 18:02:21 crc kubenswrapper[4837]: E0111 18:02:21.555774 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd04d490-42de-47b9-aa6f-bc09ba8dd539" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.555790 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd04d490-42de-47b9-aa6f-bc09ba8dd539" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.556103 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c8127b-3998-456f-bd2c-01f945d7f0b9" containerName="keystone-cron" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.556142 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd04d490-42de-47b9-aa6f-bc09ba8dd539" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.557144 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.561574 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.561601 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.561921 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.561943 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.571280 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9"] Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.641934 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.642050 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24g5q\" (UniqueName: \"kubernetes.io/projected/be22134a-b58f-4a66-bcb2-0545a067b33b-kube-api-access-24g5q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 
18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.642155 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.743768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.743975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.744829 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24g5q\" (UniqueName: \"kubernetes.io/projected/be22134a-b58f-4a66-bcb2-0545a067b33b-kube-api-access-24g5q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.753383 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.754888 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.765813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24g5q\" (UniqueName: \"kubernetes.io/projected/be22134a-b58f-4a66-bcb2-0545a067b33b-kube-api-access-24g5q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:21 crc kubenswrapper[4837]: I0111 18:02:21.883130 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:22 crc kubenswrapper[4837]: I0111 18:02:22.489437 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9"] Jan 11 18:02:23 crc kubenswrapper[4837]: I0111 18:02:23.470023 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" event={"ID":"be22134a-b58f-4a66-bcb2-0545a067b33b","Type":"ContainerStarted","Data":"dbf581cbc2b8211ab97339456ce52eeaf11cd676eeb4865bec5a74fce60ce615"} Jan 11 18:02:23 crc kubenswrapper[4837]: I0111 18:02:23.470759 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" event={"ID":"be22134a-b58f-4a66-bcb2-0545a067b33b","Type":"ContainerStarted","Data":"266a3fe2cd73937638a379db56095d312e9b792bf66b4f7b53286ae54d2465ee"} Jan 11 18:02:23 crc kubenswrapper[4837]: I0111 18:02:23.490252 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" podStartSLOduration=2.0873352 podStartE2EDuration="2.490231341s" podCreationTimestamp="2026-01-11 18:02:21 +0000 UTC" firstStartedPulling="2026-01-11 18:02:22.496013238 +0000 UTC m=+1916.674205944" lastFinishedPulling="2026-01-11 18:02:22.898909339 +0000 UTC m=+1917.077102085" observedRunningTime="2026-01-11 18:02:23.488520274 +0000 UTC m=+1917.666713030" watchObservedRunningTime="2026-01-11 18:02:23.490231341 +0000 UTC m=+1917.668424067" Jan 11 18:02:28 crc kubenswrapper[4837]: I0111 18:02:28.525226 4837 generic.go:334] "Generic (PLEG): container finished" podID="be22134a-b58f-4a66-bcb2-0545a067b33b" containerID="dbf581cbc2b8211ab97339456ce52eeaf11cd676eeb4865bec5a74fce60ce615" exitCode=0 Jan 11 18:02:28 crc kubenswrapper[4837]: I0111 18:02:28.525371 4837 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" event={"ID":"be22134a-b58f-4a66-bcb2-0545a067b33b","Type":"ContainerDied","Data":"dbf581cbc2b8211ab97339456ce52eeaf11cd676eeb4865bec5a74fce60ce615"} Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.078396 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.227315 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-inventory\") pod \"be22134a-b58f-4a66-bcb2-0545a067b33b\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.227764 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24g5q\" (UniqueName: \"kubernetes.io/projected/be22134a-b58f-4a66-bcb2-0545a067b33b-kube-api-access-24g5q\") pod \"be22134a-b58f-4a66-bcb2-0545a067b33b\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.227839 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-ssh-key-openstack-edpm-ipam\") pod \"be22134a-b58f-4a66-bcb2-0545a067b33b\" (UID: \"be22134a-b58f-4a66-bcb2-0545a067b33b\") " Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.236112 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be22134a-b58f-4a66-bcb2-0545a067b33b-kube-api-access-24g5q" (OuterVolumeSpecName: "kube-api-access-24g5q") pod "be22134a-b58f-4a66-bcb2-0545a067b33b" (UID: "be22134a-b58f-4a66-bcb2-0545a067b33b"). InnerVolumeSpecName "kube-api-access-24g5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.275855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-inventory" (OuterVolumeSpecName: "inventory") pod "be22134a-b58f-4a66-bcb2-0545a067b33b" (UID: "be22134a-b58f-4a66-bcb2-0545a067b33b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.279453 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be22134a-b58f-4a66-bcb2-0545a067b33b" (UID: "be22134a-b58f-4a66-bcb2-0545a067b33b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.330918 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24g5q\" (UniqueName: \"kubernetes.io/projected/be22134a-b58f-4a66-bcb2-0545a067b33b-kube-api-access-24g5q\") on node \"crc\" DevicePath \"\"" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.330985 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.331014 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be22134a-b58f-4a66-bcb2-0545a067b33b-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.562539 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" 
event={"ID":"be22134a-b58f-4a66-bcb2-0545a067b33b","Type":"ContainerDied","Data":"266a3fe2cd73937638a379db56095d312e9b792bf66b4f7b53286ae54d2465ee"} Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.562591 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="266a3fe2cd73937638a379db56095d312e9b792bf66b4f7b53286ae54d2465ee" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.562649 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.663353 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n"] Jan 11 18:02:30 crc kubenswrapper[4837]: E0111 18:02:30.663977 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be22134a-b58f-4a66-bcb2-0545a067b33b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.664008 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="be22134a-b58f-4a66-bcb2-0545a067b33b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.664375 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="be22134a-b58f-4a66-bcb2-0545a067b33b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.665399 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.668865 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.669121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.669440 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.671332 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.689670 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n"] Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.840784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.841007 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69d9j\" (UniqueName: \"kubernetes.io/projected/24037829-f96f-4b2e-93b1-968e19a0edb8-kube-api-access-69d9j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.841221 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.942956 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.943041 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69d9j\" (UniqueName: \"kubernetes.io/projected/24037829-f96f-4b2e-93b1-968e19a0edb8-kube-api-access-69d9j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.943155 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.949009 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.953638 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.973356 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69d9j\" (UniqueName: \"kubernetes.io/projected/24037829-f96f-4b2e-93b1-968e19a0edb8-kube-api-access-69d9j\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8p5n\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:30 crc kubenswrapper[4837]: I0111 18:02:30.985049 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:02:31 crc kubenswrapper[4837]: I0111 18:02:31.349863 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n"] Jan 11 18:02:31 crc kubenswrapper[4837]: I0111 18:02:31.572140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" event={"ID":"24037829-f96f-4b2e-93b1-968e19a0edb8","Type":"ContainerStarted","Data":"a6a549925cdf222dda39599dc6a1dd4c4a866d990074c12718649130a9cfcf08"} Jan 11 18:02:32 crc kubenswrapper[4837]: I0111 18:02:32.364625 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:02:32 crc kubenswrapper[4837]: E0111 18:02:32.365801 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:02:32 crc kubenswrapper[4837]: I0111 18:02:32.589141 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" event={"ID":"24037829-f96f-4b2e-93b1-968e19a0edb8","Type":"ContainerStarted","Data":"cde2534fd95d08ef0a6a36e10a55663cec89b9da7be3c40eab6bb45c6cc1640f"} Jan 11 18:02:32 crc kubenswrapper[4837]: I0111 18:02:32.640076 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" podStartSLOduration=2.190435251 podStartE2EDuration="2.640045023s" podCreationTimestamp="2026-01-11 18:02:30 +0000 UTC" firstStartedPulling="2026-01-11 
18:02:31.355794916 +0000 UTC m=+1925.533987632" lastFinishedPulling="2026-01-11 18:02:31.805404658 +0000 UTC m=+1925.983597404" observedRunningTime="2026-01-11 18:02:32.609825922 +0000 UTC m=+1926.788018658" watchObservedRunningTime="2026-01-11 18:02:32.640045023 +0000 UTC m=+1926.818237769" Jan 11 18:02:36 crc kubenswrapper[4837]: I0111 18:02:36.446201 4837 scope.go:117] "RemoveContainer" containerID="2484e2f3d4f9e02a4e3e8f7c00cb627e0fa0c74308de8ec37513e26776d877a3" Jan 11 18:02:36 crc kubenswrapper[4837]: I0111 18:02:36.497818 4837 scope.go:117] "RemoveContainer" containerID="8cab87f746855c422ef01a1606595bfd84e949e4b33bdb593d09d8efdaf27647" Jan 11 18:02:36 crc kubenswrapper[4837]: I0111 18:02:36.547437 4837 scope.go:117] "RemoveContainer" containerID="bd8f463131730357cfa62a1f3faa0774e2d9f9588be3e824c3284c77ce387198" Jan 11 18:02:36 crc kubenswrapper[4837]: I0111 18:02:36.597629 4837 scope.go:117] "RemoveContainer" containerID="437e503187a681729589f86207c5c5c71561dae2047ed8281de6b7e9f0d1498f" Jan 11 18:02:36 crc kubenswrapper[4837]: I0111 18:02:36.641503 4837 scope.go:117] "RemoveContainer" containerID="6baa8ef7bb546d3bdf87c4bba378b15f00d28eefc5d10b4bb95bf7755e989703" Jan 11 18:02:36 crc kubenswrapper[4837]: I0111 18:02:36.676240 4837 scope.go:117] "RemoveContainer" containerID="7fa53c8e94596391adea74ec48e1196c2e3b48844f54bb196268fa54002d16c6" Jan 11 18:02:45 crc kubenswrapper[4837]: I0111 18:02:45.365315 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:02:45 crc kubenswrapper[4837]: I0111 18:02:45.730641 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"6a353f8b2825371ba8ab38b9b1d705da9972513542365a5827d613ce87b61f00"} Jan 11 18:02:53 crc kubenswrapper[4837]: I0111 18:02:53.073095 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-q5qqm"] Jan 11 18:02:53 crc kubenswrapper[4837]: I0111 18:02:53.086631 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q5qqm"] Jan 11 18:02:54 crc kubenswrapper[4837]: I0111 18:02:54.374548 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f" path="/var/lib/kubelet/pods/0550c6b1-8707-42d4-a3ac-f2cd7f8ac80f/volumes" Jan 11 18:03:13 crc kubenswrapper[4837]: I0111 18:03:13.044175 4837 generic.go:334] "Generic (PLEG): container finished" podID="24037829-f96f-4b2e-93b1-968e19a0edb8" containerID="cde2534fd95d08ef0a6a36e10a55663cec89b9da7be3c40eab6bb45c6cc1640f" exitCode=0 Jan 11 18:03:13 crc kubenswrapper[4837]: I0111 18:03:13.044223 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" event={"ID":"24037829-f96f-4b2e-93b1-968e19a0edb8","Type":"ContainerDied","Data":"cde2534fd95d08ef0a6a36e10a55663cec89b9da7be3c40eab6bb45c6cc1640f"} Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.508070 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.591551 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-inventory\") pod \"24037829-f96f-4b2e-93b1-968e19a0edb8\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.591622 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-ssh-key-openstack-edpm-ipam\") pod \"24037829-f96f-4b2e-93b1-968e19a0edb8\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.591895 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69d9j\" (UniqueName: \"kubernetes.io/projected/24037829-f96f-4b2e-93b1-968e19a0edb8-kube-api-access-69d9j\") pod \"24037829-f96f-4b2e-93b1-968e19a0edb8\" (UID: \"24037829-f96f-4b2e-93b1-968e19a0edb8\") " Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.614224 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24037829-f96f-4b2e-93b1-968e19a0edb8-kube-api-access-69d9j" (OuterVolumeSpecName: "kube-api-access-69d9j") pod "24037829-f96f-4b2e-93b1-968e19a0edb8" (UID: "24037829-f96f-4b2e-93b1-968e19a0edb8"). InnerVolumeSpecName "kube-api-access-69d9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.631639 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-inventory" (OuterVolumeSpecName: "inventory") pod "24037829-f96f-4b2e-93b1-968e19a0edb8" (UID: "24037829-f96f-4b2e-93b1-968e19a0edb8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.644422 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "24037829-f96f-4b2e-93b1-968e19a0edb8" (UID: "24037829-f96f-4b2e-93b1-968e19a0edb8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.694883 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69d9j\" (UniqueName: \"kubernetes.io/projected/24037829-f96f-4b2e-93b1-968e19a0edb8-kube-api-access-69d9j\") on node \"crc\" DevicePath \"\"" Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.694929 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:03:14 crc kubenswrapper[4837]: I0111 18:03:14.694946 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24037829-f96f-4b2e-93b1-968e19a0edb8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.069969 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" event={"ID":"24037829-f96f-4b2e-93b1-968e19a0edb8","Type":"ContainerDied","Data":"a6a549925cdf222dda39599dc6a1dd4c4a866d990074c12718649130a9cfcf08"} Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.070027 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6a549925cdf222dda39599dc6a1dd4c4a866d990074c12718649130a9cfcf08" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 
18:03:15.070075 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8p5n" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.233847 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn"] Jan 11 18:03:15 crc kubenswrapper[4837]: E0111 18:03:15.234395 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24037829-f96f-4b2e-93b1-968e19a0edb8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.234420 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="24037829-f96f-4b2e-93b1-968e19a0edb8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.234648 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="24037829-f96f-4b2e-93b1-968e19a0edb8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.241273 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.245888 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.250127 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.259855 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.262435 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.276099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn"] Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.436830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mr4\" (UniqueName: \"kubernetes.io/projected/355acc57-d5c4-46fa-8881-61cae424d004-kube-api-access-52mr4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.436942 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.437135 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.539636 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52mr4\" (UniqueName: \"kubernetes.io/projected/355acc57-d5c4-46fa-8881-61cae424d004-kube-api-access-52mr4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.539823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.539876 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.545498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.546406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.559275 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52mr4\" (UniqueName: \"kubernetes.io/projected/355acc57-d5c4-46fa-8881-61cae424d004-kube-api-access-52mr4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nffxn\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:15 crc kubenswrapper[4837]: I0111 18:03:15.577139 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:03:16 crc kubenswrapper[4837]: I0111 18:03:16.060583 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5wzxr"] Jan 11 18:03:16 crc kubenswrapper[4837]: I0111 18:03:16.071839 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5wzxr"] Jan 11 18:03:16 crc kubenswrapper[4837]: I0111 18:03:16.179782 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn"] Jan 11 18:03:16 crc kubenswrapper[4837]: I0111 18:03:16.377384 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb261701-19a1-4f8f-a84b-e8748c2eb561" path="/var/lib/kubelet/pods/bb261701-19a1-4f8f-a84b-e8748c2eb561/volumes" Jan 11 18:03:17 crc kubenswrapper[4837]: I0111 18:03:17.090146 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" event={"ID":"355acc57-d5c4-46fa-8881-61cae424d004","Type":"ContainerStarted","Data":"c3e024b4c4a249bea77d920751f2efb89c5a5a327584439afdf9a792f9c98de1"} Jan 11 18:03:17 crc kubenswrapper[4837]: I0111 18:03:17.090552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" event={"ID":"355acc57-d5c4-46fa-8881-61cae424d004","Type":"ContainerStarted","Data":"db573fa73ac4d40bead94cf7d70e36c6e83b2839dfc8849d125d8272d2fc9588"} Jan 11 18:03:17 crc kubenswrapper[4837]: I0111 18:03:17.113910 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" podStartSLOduration=1.664636987 podStartE2EDuration="2.11389355s" podCreationTimestamp="2026-01-11 18:03:15 +0000 UTC" firstStartedPulling="2026-01-11 18:03:16.185707998 +0000 UTC m=+1970.363900714" lastFinishedPulling="2026-01-11 
18:03:16.634964521 +0000 UTC m=+1970.813157277" observedRunningTime="2026-01-11 18:03:17.107439827 +0000 UTC m=+1971.285632523" watchObservedRunningTime="2026-01-11 18:03:17.11389355 +0000 UTC m=+1971.292086256" Jan 11 18:03:18 crc kubenswrapper[4837]: I0111 18:03:18.061797 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-78z78"] Jan 11 18:03:18 crc kubenswrapper[4837]: I0111 18:03:18.074591 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-78z78"] Jan 11 18:03:18 crc kubenswrapper[4837]: I0111 18:03:18.381175 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8" path="/var/lib/kubelet/pods/6da0d3e9-aabb-41f6-aaad-7cb1d1f203c8/volumes" Jan 11 18:03:36 crc kubenswrapper[4837]: I0111 18:03:36.829762 4837 scope.go:117] "RemoveContainer" containerID="36d7dcea07af7a2b3cdbeff10a258a5334de2d4d97b15aeca8b1d43fdd0bdb14" Jan 11 18:03:36 crc kubenswrapper[4837]: I0111 18:03:36.913543 4837 scope.go:117] "RemoveContainer" containerID="f633d6111f7d5bceba726b3ef0fefa136f025a6f85e8dc8f5ebde40e5830b39f" Jan 11 18:03:36 crc kubenswrapper[4837]: I0111 18:03:36.964069 4837 scope.go:117] "RemoveContainer" containerID="9b9a7b0972d386315e54f02a158bf16ccf012c9b8757d163aac3a554e6f3dbef" Jan 11 18:04:00 crc kubenswrapper[4837]: I0111 18:04:00.040631 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mdgzf"] Jan 11 18:04:00 crc kubenswrapper[4837]: I0111 18:04:00.052924 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mdgzf"] Jan 11 18:04:00 crc kubenswrapper[4837]: I0111 18:04:00.377592 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293316f1-dfe8-42c8-82d3-365be90cdbd1" path="/var/lib/kubelet/pods/293316f1-dfe8-42c8-82d3-365be90cdbd1/volumes" Jan 11 18:04:16 crc kubenswrapper[4837]: I0111 18:04:16.719790 4837 
generic.go:334] "Generic (PLEG): container finished" podID="355acc57-d5c4-46fa-8881-61cae424d004" containerID="c3e024b4c4a249bea77d920751f2efb89c5a5a327584439afdf9a792f9c98de1" exitCode=0 Jan 11 18:04:16 crc kubenswrapper[4837]: I0111 18:04:16.719860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" event={"ID":"355acc57-d5c4-46fa-8881-61cae424d004","Type":"ContainerDied","Data":"c3e024b4c4a249bea77d920751f2efb89c5a5a327584439afdf9a792f9c98de1"} Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.158878 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.257397 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-ssh-key-openstack-edpm-ipam\") pod \"355acc57-d5c4-46fa-8881-61cae424d004\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.257845 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52mr4\" (UniqueName: \"kubernetes.io/projected/355acc57-d5c4-46fa-8881-61cae424d004-kube-api-access-52mr4\") pod \"355acc57-d5c4-46fa-8881-61cae424d004\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.257883 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-inventory\") pod \"355acc57-d5c4-46fa-8881-61cae424d004\" (UID: \"355acc57-d5c4-46fa-8881-61cae424d004\") " Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.263540 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/355acc57-d5c4-46fa-8881-61cae424d004-kube-api-access-52mr4" (OuterVolumeSpecName: "kube-api-access-52mr4") pod "355acc57-d5c4-46fa-8881-61cae424d004" (UID: "355acc57-d5c4-46fa-8881-61cae424d004"). InnerVolumeSpecName "kube-api-access-52mr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.287267 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-inventory" (OuterVolumeSpecName: "inventory") pod "355acc57-d5c4-46fa-8881-61cae424d004" (UID: "355acc57-d5c4-46fa-8881-61cae424d004"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.291768 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "355acc57-d5c4-46fa-8881-61cae424d004" (UID: "355acc57-d5c4-46fa-8881-61cae424d004"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.360744 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52mr4\" (UniqueName: \"kubernetes.io/projected/355acc57-d5c4-46fa-8881-61cae424d004-kube-api-access-52mr4\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.361052 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.361263 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/355acc57-d5c4-46fa-8881-61cae424d004-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.744812 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" event={"ID":"355acc57-d5c4-46fa-8881-61cae424d004","Type":"ContainerDied","Data":"db573fa73ac4d40bead94cf7d70e36c6e83b2839dfc8849d125d8272d2fc9588"} Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.744868 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db573fa73ac4d40bead94cf7d70e36c6e83b2839dfc8849d125d8272d2fc9588" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.744899 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nffxn" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.839229 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jcmv2"] Jan 11 18:04:18 crc kubenswrapper[4837]: E0111 18:04:18.844398 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355acc57-d5c4-46fa-8881-61cae424d004" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.844601 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="355acc57-d5c4-46fa-8881-61cae424d004" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.845123 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="355acc57-d5c4-46fa-8881-61cae424d004" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.846130 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.849953 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.849960 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.849964 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.851814 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.854072 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jcmv2"] Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.870134 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.870210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.870266 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6t6js\" (UniqueName: \"kubernetes.io/projected/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-kube-api-access-6t6js\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.972371 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.972439 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.972464 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t6js\" (UniqueName: \"kubernetes.io/projected/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-kube-api-access-6t6js\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.978391 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 
18:04:18.979377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:18 crc kubenswrapper[4837]: I0111 18:04:18.995213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t6js\" (UniqueName: \"kubernetes.io/projected/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-kube-api-access-6t6js\") pod \"ssh-known-hosts-edpm-deployment-jcmv2\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:19 crc kubenswrapper[4837]: I0111 18:04:19.171041 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:19 crc kubenswrapper[4837]: I0111 18:04:19.732328 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jcmv2"] Jan 11 18:04:19 crc kubenswrapper[4837]: I0111 18:04:19.736927 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 18:04:19 crc kubenswrapper[4837]: I0111 18:04:19.753661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" event={"ID":"bd3dd5e3-2424-41fd-a0cf-ae265214d12f","Type":"ContainerStarted","Data":"ddeb5a60da4e0f571da8d4ddae4650d6eb9cd11aca0da960b166473310325c5a"} Jan 11 18:04:20 crc kubenswrapper[4837]: I0111 18:04:20.766656 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" event={"ID":"bd3dd5e3-2424-41fd-a0cf-ae265214d12f","Type":"ContainerStarted","Data":"d78f2d7b00dd967dccd4eb59cd0a54adfbf9d261009fafb2076000bf679cdcb1"} Jan 11 18:04:20 crc kubenswrapper[4837]: I0111 18:04:20.805428 
4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" podStartSLOduration=2.364964418 podStartE2EDuration="2.805395884s" podCreationTimestamp="2026-01-11 18:04:18 +0000 UTC" firstStartedPulling="2026-01-11 18:04:19.736629113 +0000 UTC m=+2033.914821819" lastFinishedPulling="2026-01-11 18:04:20.177060539 +0000 UTC m=+2034.355253285" observedRunningTime="2026-01-11 18:04:20.78919993 +0000 UTC m=+2034.967392706" watchObservedRunningTime="2026-01-11 18:04:20.805395884 +0000 UTC m=+2034.983588620" Jan 11 18:04:27 crc kubenswrapper[4837]: I0111 18:04:27.842019 4837 generic.go:334] "Generic (PLEG): container finished" podID="bd3dd5e3-2424-41fd-a0cf-ae265214d12f" containerID="d78f2d7b00dd967dccd4eb59cd0a54adfbf9d261009fafb2076000bf679cdcb1" exitCode=0 Jan 11 18:04:27 crc kubenswrapper[4837]: I0111 18:04:27.842115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" event={"ID":"bd3dd5e3-2424-41fd-a0cf-ae265214d12f","Type":"ContainerDied","Data":"d78f2d7b00dd967dccd4eb59cd0a54adfbf9d261009fafb2076000bf679cdcb1"} Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.311706 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.465169 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-inventory-0\") pod \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.465322 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-ssh-key-openstack-edpm-ipam\") pod \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.465465 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t6js\" (UniqueName: \"kubernetes.io/projected/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-kube-api-access-6t6js\") pod \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\" (UID: \"bd3dd5e3-2424-41fd-a0cf-ae265214d12f\") " Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.470609 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-kube-api-access-6t6js" (OuterVolumeSpecName: "kube-api-access-6t6js") pod "bd3dd5e3-2424-41fd-a0cf-ae265214d12f" (UID: "bd3dd5e3-2424-41fd-a0cf-ae265214d12f"). InnerVolumeSpecName "kube-api-access-6t6js". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.490141 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bd3dd5e3-2424-41fd-a0cf-ae265214d12f" (UID: "bd3dd5e3-2424-41fd-a0cf-ae265214d12f"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.500866 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bd3dd5e3-2424-41fd-a0cf-ae265214d12f" (UID: "bd3dd5e3-2424-41fd-a0cf-ae265214d12f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.568029 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t6js\" (UniqueName: \"kubernetes.io/projected/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-kube-api-access-6t6js\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.568298 4837 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.568310 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3dd5e3-2424-41fd-a0cf-ae265214d12f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.863408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" event={"ID":"bd3dd5e3-2424-41fd-a0cf-ae265214d12f","Type":"ContainerDied","Data":"ddeb5a60da4e0f571da8d4ddae4650d6eb9cd11aca0da960b166473310325c5a"} Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.863445 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jcmv2" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.863459 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddeb5a60da4e0f571da8d4ddae4650d6eb9cd11aca0da960b166473310325c5a" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.948710 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks"] Jan 11 18:04:29 crc kubenswrapper[4837]: E0111 18:04:29.949545 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd3dd5e3-2424-41fd-a0cf-ae265214d12f" containerName="ssh-known-hosts-edpm-deployment" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.949590 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3dd5e3-2424-41fd-a0cf-ae265214d12f" containerName="ssh-known-hosts-edpm-deployment" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.950115 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd3dd5e3-2424-41fd-a0cf-ae265214d12f" containerName="ssh-known-hosts-edpm-deployment" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.951482 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.953529 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.953970 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.954124 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.954377 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:04:29 crc kubenswrapper[4837]: I0111 18:04:29.963826 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks"] Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.077910 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.078257 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.078351 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h45g\" (UniqueName: \"kubernetes.io/projected/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-kube-api-access-6h45g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.181103 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.181173 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h45g\" (UniqueName: \"kubernetes.io/projected/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-kube-api-access-6h45g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.181293 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.185879 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.190478 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.210135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h45g\" (UniqueName: \"kubernetes.io/projected/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-kube-api-access-6h45g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xtrks\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.280667 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:30 crc kubenswrapper[4837]: I0111 18:04:30.916162 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks"] Jan 11 18:04:31 crc kubenswrapper[4837]: I0111 18:04:31.888036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" event={"ID":"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb","Type":"ContainerStarted","Data":"158c58e45a535ad3dae401ec19980edc13fec18306ccef97a744b6335e11a716"} Jan 11 18:04:31 crc kubenswrapper[4837]: I0111 18:04:31.888637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" event={"ID":"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb","Type":"ContainerStarted","Data":"30abf6ff8f71e15145f2ea8e327a999d4f748bc31ab89fbeb87a21c43d66dd72"} Jan 11 18:04:31 crc kubenswrapper[4837]: I0111 18:04:31.925908 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" podStartSLOduration=2.445834004 podStartE2EDuration="2.925881023s" podCreationTimestamp="2026-01-11 18:04:29 +0000 UTC" firstStartedPulling="2026-01-11 18:04:30.928791074 +0000 UTC m=+2045.106983790" lastFinishedPulling="2026-01-11 18:04:31.408838103 +0000 UTC m=+2045.587030809" observedRunningTime="2026-01-11 18:04:31.911050716 +0000 UTC m=+2046.089243472" watchObservedRunningTime="2026-01-11 18:04:31.925881023 +0000 UTC m=+2046.104073759" Jan 11 18:04:37 crc kubenswrapper[4837]: I0111 18:04:37.095025 4837 scope.go:117] "RemoveContainer" containerID="1fbb68e7f3679aad99ef6abda0b2c572c0228d6ef797d566597d8a067bfc7bbb" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.175815 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wtl57"] Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.193317 
4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.234930 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wtl57"] Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.268739 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-catalog-content\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.268811 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvbx\" (UniqueName: \"kubernetes.io/projected/a283f867-21e5-4e07-8acb-58ce437a05da-kube-api-access-jsvbx\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.268932 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-utilities\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.371377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-catalog-content\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.371467 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvbx\" (UniqueName: \"kubernetes.io/projected/a283f867-21e5-4e07-8acb-58ce437a05da-kube-api-access-jsvbx\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.371566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-utilities\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.372338 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-catalog-content\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.372415 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-utilities\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.391169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvbx\" (UniqueName: \"kubernetes.io/projected/a283f867-21e5-4e07-8acb-58ce437a05da-kube-api-access-jsvbx\") pod \"redhat-operators-wtl57\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.548137 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.965639 4837 generic.go:334] "Generic (PLEG): container finished" podID="2809dbe5-de4c-4d4d-9a2c-85c51f4591cb" containerID="158c58e45a535ad3dae401ec19980edc13fec18306ccef97a744b6335e11a716" exitCode=0 Jan 11 18:04:39 crc kubenswrapper[4837]: I0111 18:04:39.965818 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" event={"ID":"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb","Type":"ContainerDied","Data":"158c58e45a535ad3dae401ec19980edc13fec18306ccef97a744b6335e11a716"} Jan 11 18:04:40 crc kubenswrapper[4837]: I0111 18:04:40.061898 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wtl57"] Jan 11 18:04:40 crc kubenswrapper[4837]: I0111 18:04:40.975814 4837 generic.go:334] "Generic (PLEG): container finished" podID="a283f867-21e5-4e07-8acb-58ce437a05da" containerID="8e1afa8fe449a2bcf25de104615bc6e0158748c772908ef8d8238e803bd6a977" exitCode=0 Jan 11 18:04:40 crc kubenswrapper[4837]: I0111 18:04:40.975959 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wtl57" event={"ID":"a283f867-21e5-4e07-8acb-58ce437a05da","Type":"ContainerDied","Data":"8e1afa8fe449a2bcf25de104615bc6e0158748c772908ef8d8238e803bd6a977"} Jan 11 18:04:40 crc kubenswrapper[4837]: I0111 18:04:40.976347 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wtl57" event={"ID":"a283f867-21e5-4e07-8acb-58ce437a05da","Type":"ContainerStarted","Data":"dbf14123704f6d917c7c2dde8f80d1f6f0ca49815be9864f5b6a1037501f5049"} Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.401074 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.512929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-inventory\") pod \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.512994 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h45g\" (UniqueName: \"kubernetes.io/projected/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-kube-api-access-6h45g\") pod \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.513232 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-ssh-key-openstack-edpm-ipam\") pod \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\" (UID: \"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb\") " Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.517443 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-kube-api-access-6h45g" (OuterVolumeSpecName: "kube-api-access-6h45g") pod "2809dbe5-de4c-4d4d-9a2c-85c51f4591cb" (UID: "2809dbe5-de4c-4d4d-9a2c-85c51f4591cb"). InnerVolumeSpecName "kube-api-access-6h45g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.544502 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-inventory" (OuterVolumeSpecName: "inventory") pod "2809dbe5-de4c-4d4d-9a2c-85c51f4591cb" (UID: "2809dbe5-de4c-4d4d-9a2c-85c51f4591cb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.546322 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2809dbe5-de4c-4d4d-9a2c-85c51f4591cb" (UID: "2809dbe5-de4c-4d4d-9a2c-85c51f4591cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.615707 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.615734 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:41 crc kubenswrapper[4837]: I0111 18:04:41.615743 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h45g\" (UniqueName: \"kubernetes.io/projected/2809dbe5-de4c-4d4d-9a2c-85c51f4591cb-kube-api-access-6h45g\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.039893 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" event={"ID":"2809dbe5-de4c-4d4d-9a2c-85c51f4591cb","Type":"ContainerDied","Data":"30abf6ff8f71e15145f2ea8e327a999d4f748bc31ab89fbeb87a21c43d66dd72"} Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.040441 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30abf6ff8f71e15145f2ea8e327a999d4f748bc31ab89fbeb87a21c43d66dd72" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 
18:04:42.039998 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xtrks" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.107633 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s"] Jan 11 18:04:42 crc kubenswrapper[4837]: E0111 18:04:42.108260 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2809dbe5-de4c-4d4d-9a2c-85c51f4591cb" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.108314 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2809dbe5-de4c-4d4d-9a2c-85c51f4591cb" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.108623 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2809dbe5-de4c-4d4d-9a2c-85c51f4591cb" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.109540 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.112439 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.114903 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.116381 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.119983 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.121867 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s"] Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.227213 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.227348 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.227474 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkzr9\" (UniqueName: \"kubernetes.io/projected/a2775520-8fe3-45e2-aab4-91f962ef86cb-kube-api-access-wkzr9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.329107 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkzr9\" (UniqueName: \"kubernetes.io/projected/a2775520-8fe3-45e2-aab4-91f962ef86cb-kube-api-access-wkzr9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.329281 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.329555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.334941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.335514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.353297 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkzr9\" (UniqueName: \"kubernetes.io/projected/a2775520-8fe3-45e2-aab4-91f962ef86cb-kube-api-access-wkzr9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.432568 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:42 crc kubenswrapper[4837]: I0111 18:04:42.982029 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s"] Jan 11 18:04:43 crc kubenswrapper[4837]: I0111 18:04:43.063338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wtl57" event={"ID":"a283f867-21e5-4e07-8acb-58ce437a05da","Type":"ContainerStarted","Data":"abf450cd9e43656072632cd75138e94762d03ff39643b27e0f49a6975c79d917"} Jan 11 18:04:43 crc kubenswrapper[4837]: I0111 18:04:43.065270 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" event={"ID":"a2775520-8fe3-45e2-aab4-91f962ef86cb","Type":"ContainerStarted","Data":"fd944756bf6c3fb9cf2d38b250cda057c2845b70fee92c2242a175b913f30664"} Jan 11 18:04:45 crc kubenswrapper[4837]: I0111 18:04:45.086067 4837 generic.go:334] "Generic (PLEG): container finished" podID="a283f867-21e5-4e07-8acb-58ce437a05da" containerID="abf450cd9e43656072632cd75138e94762d03ff39643b27e0f49a6975c79d917" exitCode=0 Jan 11 18:04:45 crc kubenswrapper[4837]: I0111 18:04:45.086957 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wtl57" event={"ID":"a283f867-21e5-4e07-8acb-58ce437a05da","Type":"ContainerDied","Data":"abf450cd9e43656072632cd75138e94762d03ff39643b27e0f49a6975c79d917"} Jan 11 18:04:45 crc kubenswrapper[4837]: I0111 18:04:45.089916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" event={"ID":"a2775520-8fe3-45e2-aab4-91f962ef86cb","Type":"ContainerStarted","Data":"d5b5607b034bc9d8c4af393da7677b9c48f02a6aef7a0d0a63e1d46f1b376962"} Jan 11 18:04:46 crc kubenswrapper[4837]: I0111 18:04:46.127017 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" podStartSLOduration=3.4322361040000002 podStartE2EDuration="4.126996769s" podCreationTimestamp="2026-01-11 18:04:42 +0000 UTC" firstStartedPulling="2026-01-11 18:04:43.005971672 +0000 UTC m=+2057.184164378" lastFinishedPulling="2026-01-11 18:04:43.700732297 +0000 UTC m=+2057.878925043" observedRunningTime="2026-01-11 18:04:46.120290779 +0000 UTC m=+2060.298483495" watchObservedRunningTime="2026-01-11 18:04:46.126996769 +0000 UTC m=+2060.305189475" Jan 11 18:04:47 crc kubenswrapper[4837]: I0111 18:04:47.121774 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wtl57" event={"ID":"a283f867-21e5-4e07-8acb-58ce437a05da","Type":"ContainerStarted","Data":"c8a927c41c75cf7e8858cfbaa70d6f3ffa4415a757c08a03ca207c57ea7dc379"} Jan 11 18:04:47 crc kubenswrapper[4837]: I0111 18:04:47.152502 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wtl57" podStartSLOduration=3.151831425 podStartE2EDuration="8.15248434s" podCreationTimestamp="2026-01-11 18:04:39 +0000 UTC" firstStartedPulling="2026-01-11 18:04:40.978316036 +0000 UTC m=+2055.156508742" lastFinishedPulling="2026-01-11 18:04:45.978968951 +0000 UTC m=+2060.157161657" observedRunningTime="2026-01-11 18:04:47.140504928 +0000 UTC m=+2061.318697654" watchObservedRunningTime="2026-01-11 18:04:47.15248434 +0000 UTC m=+2061.330677046" Jan 11 18:04:49 crc kubenswrapper[4837]: I0111 18:04:49.549237 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:49 crc kubenswrapper[4837]: I0111 18:04:49.549766 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:50 crc kubenswrapper[4837]: I0111 18:04:50.607094 4837 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-wtl57" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="registry-server" probeResult="failure" output=< Jan 11 18:04:50 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 18:04:50 crc kubenswrapper[4837]: > Jan 11 18:04:55 crc kubenswrapper[4837]: I0111 18:04:55.194179 4837 generic.go:334] "Generic (PLEG): container finished" podID="a2775520-8fe3-45e2-aab4-91f962ef86cb" containerID="d5b5607b034bc9d8c4af393da7677b9c48f02a6aef7a0d0a63e1d46f1b376962" exitCode=0 Jan 11 18:04:55 crc kubenswrapper[4837]: I0111 18:04:55.194297 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" event={"ID":"a2775520-8fe3-45e2-aab4-91f962ef86cb","Type":"ContainerDied","Data":"d5b5607b034bc9d8c4af393da7677b9c48f02a6aef7a0d0a63e1d46f1b376962"} Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.711836 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.832727 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-inventory\") pod \"a2775520-8fe3-45e2-aab4-91f962ef86cb\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.832897 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkzr9\" (UniqueName: \"kubernetes.io/projected/a2775520-8fe3-45e2-aab4-91f962ef86cb-kube-api-access-wkzr9\") pod \"a2775520-8fe3-45e2-aab4-91f962ef86cb\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.833053 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-ssh-key-openstack-edpm-ipam\") pod \"a2775520-8fe3-45e2-aab4-91f962ef86cb\" (UID: \"a2775520-8fe3-45e2-aab4-91f962ef86cb\") " Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.839580 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2775520-8fe3-45e2-aab4-91f962ef86cb-kube-api-access-wkzr9" (OuterVolumeSpecName: "kube-api-access-wkzr9") pod "a2775520-8fe3-45e2-aab4-91f962ef86cb" (UID: "a2775520-8fe3-45e2-aab4-91f962ef86cb"). InnerVolumeSpecName "kube-api-access-wkzr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.860262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-inventory" (OuterVolumeSpecName: "inventory") pod "a2775520-8fe3-45e2-aab4-91f962ef86cb" (UID: "a2775520-8fe3-45e2-aab4-91f962ef86cb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.875506 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a2775520-8fe3-45e2-aab4-91f962ef86cb" (UID: "a2775520-8fe3-45e2-aab4-91f962ef86cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.935807 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.935836 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkzr9\" (UniqueName: \"kubernetes.io/projected/a2775520-8fe3-45e2-aab4-91f962ef86cb-kube-api-access-wkzr9\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:56 crc kubenswrapper[4837]: I0111 18:04:56.935846 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2775520-8fe3-45e2-aab4-91f962ef86cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.213970 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" event={"ID":"a2775520-8fe3-45e2-aab4-91f962ef86cb","Type":"ContainerDied","Data":"fd944756bf6c3fb9cf2d38b250cda057c2845b70fee92c2242a175b913f30664"} Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.214016 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd944756bf6c3fb9cf2d38b250cda057c2845b70fee92c2242a175b913f30664" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 
18:04:57.214080 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.336151 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9"] Jan 11 18:04:57 crc kubenswrapper[4837]: E0111 18:04:57.336846 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2775520-8fe3-45e2-aab4-91f962ef86cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.336864 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2775520-8fe3-45e2-aab4-91f962ef86cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.337039 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2775520-8fe3-45e2-aab4-91f962ef86cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.337733 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.343032 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.343236 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.343361 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.343502 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.343607 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.343812 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.343970 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.344396 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.348963 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9"] Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.448811 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.449455 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.449520 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.449569 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.449589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.449707 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.450336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.450896 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.450923 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.450945 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.451038 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjv6\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-kube-api-access-7mjv6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.451169 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.451276 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.451369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553035 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553154 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553191 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553239 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553327 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjv6\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-kube-api-access-7mjv6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553373 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553420 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553479 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553552 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553704 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553812 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: 
\"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.553965 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.554004 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.558984 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc 
kubenswrapper[4837]: I0111 18:04:57.560071 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.560162 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.560498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.561185 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.561379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.561839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.562476 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.563067 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.563358 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.564240 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.564438 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.571364 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjv6\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-kube-api-access-7mjv6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.571797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mknj9\" 
(UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:57 crc kubenswrapper[4837]: I0111 18:04:57.659471 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:04:58 crc kubenswrapper[4837]: W0111 18:04:58.249997 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode253716a_cb9e_4a48_aca6_5cbd870ef9d5.slice/crio-8939609d9e3c6ac63adf6e16124c1bd088a9e2a5b358c21052140a4bfd9e9023 WatchSource:0}: Error finding container 8939609d9e3c6ac63adf6e16124c1bd088a9e2a5b358c21052140a4bfd9e9023: Status 404 returned error can't find the container with id 8939609d9e3c6ac63adf6e16124c1bd088a9e2a5b358c21052140a4bfd9e9023 Jan 11 18:04:58 crc kubenswrapper[4837]: I0111 18:04:58.250884 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9"] Jan 11 18:04:59 crc kubenswrapper[4837]: I0111 18:04:59.250179 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" event={"ID":"e253716a-cb9e-4a48-aca6-5cbd870ef9d5","Type":"ContainerStarted","Data":"39d067c1d261391b7dee553c12806ec5e81c6130ac983c4cbde375b20936e14b"} Jan 11 18:04:59 crc kubenswrapper[4837]: I0111 18:04:59.250615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" event={"ID":"e253716a-cb9e-4a48-aca6-5cbd870ef9d5","Type":"ContainerStarted","Data":"8939609d9e3c6ac63adf6e16124c1bd088a9e2a5b358c21052140a4bfd9e9023"} Jan 11 18:04:59 crc kubenswrapper[4837]: I0111 18:04:59.286516 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" podStartSLOduration=1.808826775 
podStartE2EDuration="2.286491251s" podCreationTimestamp="2026-01-11 18:04:57 +0000 UTC" firstStartedPulling="2026-01-11 18:04:58.252928553 +0000 UTC m=+2072.431121269" lastFinishedPulling="2026-01-11 18:04:58.730593039 +0000 UTC m=+2072.908785745" observedRunningTime="2026-01-11 18:04:59.28424538 +0000 UTC m=+2073.462438096" watchObservedRunningTime="2026-01-11 18:04:59.286491251 +0000 UTC m=+2073.464683977" Jan 11 18:04:59 crc kubenswrapper[4837]: I0111 18:04:59.622159 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:59 crc kubenswrapper[4837]: I0111 18:04:59.722402 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:04:59 crc kubenswrapper[4837]: I0111 18:04:59.866932 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wtl57"] Jan 11 18:05:01 crc kubenswrapper[4837]: I0111 18:05:01.273472 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wtl57" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="registry-server" containerID="cri-o://c8a927c41c75cf7e8858cfbaa70d6f3ffa4415a757c08a03ca207c57ea7dc379" gracePeriod=2 Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.282466 4837 generic.go:334] "Generic (PLEG): container finished" podID="a283f867-21e5-4e07-8acb-58ce437a05da" containerID="c8a927c41c75cf7e8858cfbaa70d6f3ffa4415a757c08a03ca207c57ea7dc379" exitCode=0 Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.282783 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wtl57" event={"ID":"a283f867-21e5-4e07-8acb-58ce437a05da","Type":"ContainerDied","Data":"c8a927c41c75cf7e8858cfbaa70d6f3ffa4415a757c08a03ca207c57ea7dc379"} Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.282808 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wtl57" event={"ID":"a283f867-21e5-4e07-8acb-58ce437a05da","Type":"ContainerDied","Data":"dbf14123704f6d917c7c2dde8f80d1f6f0ca49815be9864f5b6a1037501f5049"} Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.282819 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf14123704f6d917c7c2dde8f80d1f6f0ca49815be9864f5b6a1037501f5049" Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.324460 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.484004 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-catalog-content\") pod \"a283f867-21e5-4e07-8acb-58ce437a05da\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.484383 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsvbx\" (UniqueName: \"kubernetes.io/projected/a283f867-21e5-4e07-8acb-58ce437a05da-kube-api-access-jsvbx\") pod \"a283f867-21e5-4e07-8acb-58ce437a05da\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.484434 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-utilities\") pod \"a283f867-21e5-4e07-8acb-58ce437a05da\" (UID: \"a283f867-21e5-4e07-8acb-58ce437a05da\") " Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.485321 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-utilities" (OuterVolumeSpecName: "utilities") pod "a283f867-21e5-4e07-8acb-58ce437a05da" 
(UID: "a283f867-21e5-4e07-8acb-58ce437a05da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.497529 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a283f867-21e5-4e07-8acb-58ce437a05da-kube-api-access-jsvbx" (OuterVolumeSpecName: "kube-api-access-jsvbx") pod "a283f867-21e5-4e07-8acb-58ce437a05da" (UID: "a283f867-21e5-4e07-8acb-58ce437a05da"). InnerVolumeSpecName "kube-api-access-jsvbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.586654 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsvbx\" (UniqueName: \"kubernetes.io/projected/a283f867-21e5-4e07-8acb-58ce437a05da-kube-api-access-jsvbx\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.586704 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.605342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a283f867-21e5-4e07-8acb-58ce437a05da" (UID: "a283f867-21e5-4e07-8acb-58ce437a05da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:05:02 crc kubenswrapper[4837]: I0111 18:05:02.688337 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a283f867-21e5-4e07-8acb-58ce437a05da-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:03 crc kubenswrapper[4837]: I0111 18:05:03.295865 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wtl57" Jan 11 18:05:03 crc kubenswrapper[4837]: I0111 18:05:03.337825 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wtl57"] Jan 11 18:05:03 crc kubenswrapper[4837]: I0111 18:05:03.348158 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wtl57"] Jan 11 18:05:04 crc kubenswrapper[4837]: I0111 18:05:04.376755 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" path="/var/lib/kubelet/pods/a283f867-21e5-4e07-8acb-58ce437a05da/volumes" Jan 11 18:05:09 crc kubenswrapper[4837]: I0111 18:05:09.443708 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:05:09 crc kubenswrapper[4837]: I0111 18:05:09.444319 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:05:39 crc kubenswrapper[4837]: I0111 18:05:39.444158 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:05:39 crc kubenswrapper[4837]: I0111 18:05:39.445573 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:05:41 crc kubenswrapper[4837]: I0111 18:05:41.711585 4837 generic.go:334] "Generic (PLEG): container finished" podID="e253716a-cb9e-4a48-aca6-5cbd870ef9d5" containerID="39d067c1d261391b7dee553c12806ec5e81c6130ac983c4cbde375b20936e14b" exitCode=0 Jan 11 18:05:41 crc kubenswrapper[4837]: I0111 18:05:41.711725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" event={"ID":"e253716a-cb9e-4a48-aca6-5cbd870ef9d5","Type":"ContainerDied","Data":"39d067c1d261391b7dee553c12806ec5e81c6130ac983c4cbde375b20936e14b"} Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.189207 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.333988 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-libvirt-combined-ca-bundle\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334036 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-bootstrap-combined-ca-bundle\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334086 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ovn-combined-ca-bundle\") 
pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334129 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-nova-combined-ca-bundle\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334205 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334255 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mjv6\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-kube-api-access-7mjv6\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334281 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-telemetry-combined-ca-bundle\") pod 
\"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334328 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-inventory\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334343 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-repo-setup-combined-ca-bundle\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ssh-key-openstack-edpm-ipam\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: 
\"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.334434 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-neutron-metadata-combined-ca-bundle\") pod \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\" (UID: \"e253716a-cb9e-4a48-aca6-5cbd870ef9d5\") " Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.342145 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.343166 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.344047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-kube-api-access-7mjv6" (OuterVolumeSpecName: "kube-api-access-7mjv6") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "kube-api-access-7mjv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.344096 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.344140 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.344196 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.344740 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.344831 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.346047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.346741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.347173 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.347249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.379985 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-inventory" (OuterVolumeSpecName: "inventory") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.386378 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e253716a-cb9e-4a48-aca6-5cbd870ef9d5" (UID: "e253716a-cb9e-4a48-aca6-5cbd870ef9d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436697 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436737 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436753 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436765 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436779 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436790 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436804 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436819 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436830 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436840 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436851 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436863 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.436874 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mjv6\" (UniqueName: \"kubernetes.io/projected/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-kube-api-access-7mjv6\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 
18:05:43.436885 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e253716a-cb9e-4a48-aca6-5cbd870ef9d5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.730394 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" event={"ID":"e253716a-cb9e-4a48-aca6-5cbd870ef9d5","Type":"ContainerDied","Data":"8939609d9e3c6ac63adf6e16124c1bd088a9e2a5b358c21052140a4bfd9e9023"} Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.730483 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8939609d9e3c6ac63adf6e16124c1bd088a9e2a5b358c21052140a4bfd9e9023" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.730600 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mknj9" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.902242 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6"] Jan 11 18:05:43 crc kubenswrapper[4837]: E0111 18:05:43.902819 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="extract-utilities" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.902845 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="extract-utilities" Jan 11 18:05:43 crc kubenswrapper[4837]: E0111 18:05:43.902891 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e253716a-cb9e-4a48-aca6-5cbd870ef9d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.902902 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e253716a-cb9e-4a48-aca6-5cbd870ef9d5" 
containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 11 18:05:43 crc kubenswrapper[4837]: E0111 18:05:43.902925 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="extract-content" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.902933 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="extract-content" Jan 11 18:05:43 crc kubenswrapper[4837]: E0111 18:05:43.902951 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="registry-server" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.902959 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="registry-server" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.903209 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a283f867-21e5-4e07-8acb-58ce437a05da" containerName="registry-server" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.903234 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e253716a-cb9e-4a48-aca6-5cbd870ef9d5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.904202 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.908316 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.908319 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.908364 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.908379 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.927021 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:05:43 crc kubenswrapper[4837]: I0111 18:05:43.932355 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6"] Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.051251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b03a0af-96d2-4573-aef4-3010b10d138b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.051343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: 
\"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.051475 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.051523 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs7jb\" (UniqueName: \"kubernetes.io/projected/8b03a0af-96d2-4573-aef4-3010b10d138b-kube-api-access-rs7jb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.051549 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.152936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b03a0af-96d2-4573-aef4-3010b10d138b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.153043 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.153079 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.153100 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs7jb\" (UniqueName: \"kubernetes.io/projected/8b03a0af-96d2-4573-aef4-3010b10d138b-kube-api-access-rs7jb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.153121 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.153977 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b03a0af-96d2-4573-aef4-3010b10d138b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: 
\"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.157325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.158205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.161376 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.174514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs7jb\" (UniqueName: \"kubernetes.io/projected/8b03a0af-96d2-4573-aef4-3010b10d138b-kube-api-access-rs7jb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-gqrp6\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.222817 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:05:44 crc kubenswrapper[4837]: I0111 18:05:44.796687 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6"] Jan 11 18:05:45 crc kubenswrapper[4837]: I0111 18:05:45.765119 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" event={"ID":"8b03a0af-96d2-4573-aef4-3010b10d138b","Type":"ContainerStarted","Data":"ccd535f530e51037f1c8395170c26c644bbde110940e9ce0f277223adc321757"} Jan 11 18:05:45 crc kubenswrapper[4837]: I0111 18:05:45.765174 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" event={"ID":"8b03a0af-96d2-4573-aef4-3010b10d138b","Type":"ContainerStarted","Data":"dc02ec13723bb069b74ea8eca42bc75d9c6f9ae59e4bb9d6a80676c58c3541e5"} Jan 11 18:05:45 crc kubenswrapper[4837]: I0111 18:05:45.807394 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" podStartSLOduration=2.402509179 podStartE2EDuration="2.807378072s" podCreationTimestamp="2026-01-11 18:05:43 +0000 UTC" firstStartedPulling="2026-01-11 18:05:44.803190672 +0000 UTC m=+2118.981383378" lastFinishedPulling="2026-01-11 18:05:45.208059565 +0000 UTC m=+2119.386252271" observedRunningTime="2026-01-11 18:05:45.790980613 +0000 UTC m=+2119.969173319" watchObservedRunningTime="2026-01-11 18:05:45.807378072 +0000 UTC m=+2119.985570778" Jan 11 18:06:09 crc kubenswrapper[4837]: I0111 18:06:09.444012 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:06:09 crc kubenswrapper[4837]: I0111 18:06:09.446507 4837 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:06:09 crc kubenswrapper[4837]: I0111 18:06:09.446948 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:06:09 crc kubenswrapper[4837]: I0111 18:06:09.449005 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a353f8b2825371ba8ab38b9b1d705da9972513542365a5827d613ce87b61f00"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:06:09 crc kubenswrapper[4837]: I0111 18:06:09.449359 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://6a353f8b2825371ba8ab38b9b1d705da9972513542365a5827d613ce87b61f00" gracePeriod=600 Jan 11 18:06:10 crc kubenswrapper[4837]: I0111 18:06:10.035248 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="6a353f8b2825371ba8ab38b9b1d705da9972513542365a5827d613ce87b61f00" exitCode=0 Jan 11 18:06:10 crc kubenswrapper[4837]: I0111 18:06:10.035314 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"6a353f8b2825371ba8ab38b9b1d705da9972513542365a5827d613ce87b61f00"} Jan 11 18:06:10 crc kubenswrapper[4837]: I0111 
18:06:10.035652 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"} Jan 11 18:06:10 crc kubenswrapper[4837]: I0111 18:06:10.035668 4837 scope.go:117] "RemoveContainer" containerID="275040296db5cef9700aa2e2244ea724fe1663bdc93466378b56f7e33809d0f9" Jan 11 18:06:53 crc kubenswrapper[4837]: I0111 18:06:53.505898 4837 generic.go:334] "Generic (PLEG): container finished" podID="8b03a0af-96d2-4573-aef4-3010b10d138b" containerID="ccd535f530e51037f1c8395170c26c644bbde110940e9ce0f277223adc321757" exitCode=0 Jan 11 18:06:53 crc kubenswrapper[4837]: I0111 18:06:53.506038 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" event={"ID":"8b03a0af-96d2-4573-aef4-3010b10d138b","Type":"ContainerDied","Data":"ccd535f530e51037f1c8395170c26c644bbde110940e9ce0f277223adc321757"} Jan 11 18:06:54 crc kubenswrapper[4837]: I0111 18:06:54.942347 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.042603 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b03a0af-96d2-4573-aef4-3010b10d138b-ovncontroller-config-0\") pod \"8b03a0af-96d2-4573-aef4-3010b10d138b\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.043012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs7jb\" (UniqueName: \"kubernetes.io/projected/8b03a0af-96d2-4573-aef4-3010b10d138b-kube-api-access-rs7jb\") pod \"8b03a0af-96d2-4573-aef4-3010b10d138b\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.043142 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ovn-combined-ca-bundle\") pod \"8b03a0af-96d2-4573-aef4-3010b10d138b\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.043268 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ssh-key-openstack-edpm-ipam\") pod \"8b03a0af-96d2-4573-aef4-3010b10d138b\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.043315 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-inventory\") pod \"8b03a0af-96d2-4573-aef4-3010b10d138b\" (UID: \"8b03a0af-96d2-4573-aef4-3010b10d138b\") " Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.048753 4837 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8b03a0af-96d2-4573-aef4-3010b10d138b" (UID: "8b03a0af-96d2-4573-aef4-3010b10d138b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.048923 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b03a0af-96d2-4573-aef4-3010b10d138b-kube-api-access-rs7jb" (OuterVolumeSpecName: "kube-api-access-rs7jb") pod "8b03a0af-96d2-4573-aef4-3010b10d138b" (UID: "8b03a0af-96d2-4573-aef4-3010b10d138b"). InnerVolumeSpecName "kube-api-access-rs7jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.071880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8b03a0af-96d2-4573-aef4-3010b10d138b" (UID: "8b03a0af-96d2-4573-aef4-3010b10d138b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.081218 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b03a0af-96d2-4573-aef4-3010b10d138b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8b03a0af-96d2-4573-aef4-3010b10d138b" (UID: "8b03a0af-96d2-4573-aef4-3010b10d138b"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.082730 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-inventory" (OuterVolumeSpecName: "inventory") pod "8b03a0af-96d2-4573-aef4-3010b10d138b" (UID: "8b03a0af-96d2-4573-aef4-3010b10d138b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.146477 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.146519 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.146537 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b03a0af-96d2-4573-aef4-3010b10d138b-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.146550 4837 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b03a0af-96d2-4573-aef4-3010b10d138b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.146563 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs7jb\" (UniqueName: \"kubernetes.io/projected/8b03a0af-96d2-4573-aef4-3010b10d138b-kube-api-access-rs7jb\") on node \"crc\" DevicePath \"\"" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.530903 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" event={"ID":"8b03a0af-96d2-4573-aef4-3010b10d138b","Type":"ContainerDied","Data":"dc02ec13723bb069b74ea8eca42bc75d9c6f9ae59e4bb9d6a80676c58c3541e5"} Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.530989 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-gqrp6" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.531348 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc02ec13723bb069b74ea8eca42bc75d9c6f9ae59e4bb9d6a80676c58c3541e5" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.613549 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx"] Jan 11 18:06:55 crc kubenswrapper[4837]: E0111 18:06:55.614207 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b03a0af-96d2-4573-aef4-3010b10d138b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.614278 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b03a0af-96d2-4573-aef4-3010b10d138b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.614510 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b03a0af-96d2-4573-aef4-3010b10d138b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.615258 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.619779 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.619784 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.619818 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.619977 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.621276 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.621293 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.627645 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx"] Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.684347 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.684407 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwwk\" (UniqueName: \"kubernetes.io/projected/dafad3b0-31b4-467e-9604-485cb65e91e5-kube-api-access-wzwwk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.684461 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.684505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.684529 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.684780 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.786908 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwwk\" (UniqueName: \"kubernetes.io/projected/dafad3b0-31b4-467e-9604-485cb65e91e5-kube-api-access-wzwwk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.786965 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.786997 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.787018 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.787098 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.787168 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.790412 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.791291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.791313 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.792236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.793618 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.810458 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwwk\" (UniqueName: \"kubernetes.io/projected/dafad3b0-31b4-467e-9604-485cb65e91e5-kube-api-access-wzwwk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:55 crc kubenswrapper[4837]: I0111 18:06:55.934614 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:06:56 crc kubenswrapper[4837]: W0111 18:06:56.484077 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddafad3b0_31b4_467e_9604_485cb65e91e5.slice/crio-b4d519fa8ed671ee8a281c6e940ccfa8500c4cc4a57cd472f2c72c0888d24b77 WatchSource:0}: Error finding container b4d519fa8ed671ee8a281c6e940ccfa8500c4cc4a57cd472f2c72c0888d24b77: Status 404 returned error can't find the container with id b4d519fa8ed671ee8a281c6e940ccfa8500c4cc4a57cd472f2c72c0888d24b77 Jan 11 18:06:56 crc kubenswrapper[4837]: I0111 18:06:56.489558 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx"] Jan 11 18:06:56 crc kubenswrapper[4837]: I0111 18:06:56.545738 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" event={"ID":"dafad3b0-31b4-467e-9604-485cb65e91e5","Type":"ContainerStarted","Data":"b4d519fa8ed671ee8a281c6e940ccfa8500c4cc4a57cd472f2c72c0888d24b77"} Jan 11 18:06:57 crc kubenswrapper[4837]: I0111 18:06:57.561005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" event={"ID":"dafad3b0-31b4-467e-9604-485cb65e91e5","Type":"ContainerStarted","Data":"3c9b6eb638c464a1cd8b130767ebba1ffe94561fed63eb8cbf54a0e858e60904"} Jan 11 18:06:57 crc kubenswrapper[4837]: I0111 18:06:57.599271 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" podStartSLOduration=1.967467526 podStartE2EDuration="2.599242918s" 
podCreationTimestamp="2026-01-11 18:06:55 +0000 UTC" firstStartedPulling="2026-01-11 18:06:56.486406325 +0000 UTC m=+2190.664599071" lastFinishedPulling="2026-01-11 18:06:57.118181727 +0000 UTC m=+2191.296374463" observedRunningTime="2026-01-11 18:06:57.591230133 +0000 UTC m=+2191.769422899" watchObservedRunningTime="2026-01-11 18:06:57.599242918 +0000 UTC m=+2191.777435664" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.034297 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7ls2z"] Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.037143 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.058756 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ls2z"] Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.100322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-utilities\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.100452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sz4k\" (UniqueName: \"kubernetes.io/projected/b05e8d99-38a1-4a31-a540-97058c253b29-kube-api-access-9sz4k\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.100502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-catalog-content\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.202222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sz4k\" (UniqueName: \"kubernetes.io/projected/b05e8d99-38a1-4a31-a540-97058c253b29-kube-api-access-9sz4k\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.202302 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-catalog-content\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.202382 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-utilities\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.202980 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-catalog-content\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.203078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-utilities\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.230232 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4vdb9"] Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.232317 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.240213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sz4k\" (UniqueName: \"kubernetes.io/projected/b05e8d99-38a1-4a31-a540-97058c253b29-kube-api-access-9sz4k\") pod \"redhat-marketplace-7ls2z\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.255118 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4vdb9"] Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.303709 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-catalog-content\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.303812 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-utilities\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc 
kubenswrapper[4837]: I0111 18:07:14.303888 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nms58\" (UniqueName: \"kubernetes.io/projected/818995b0-a7e2-40de-99d2-d0256d2ef2f6-kube-api-access-nms58\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.405356 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-utilities\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.405483 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nms58\" (UniqueName: \"kubernetes.io/projected/818995b0-a7e2-40de-99d2-d0256d2ef2f6-kube-api-access-nms58\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.405519 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-catalog-content\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.406438 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.406848 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-utilities\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.406970 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-catalog-content\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.425547 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nms58\" (UniqueName: \"kubernetes.io/projected/818995b0-a7e2-40de-99d2-d0256d2ef2f6-kube-api-access-nms58\") pod \"community-operators-4vdb9\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.630545 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:14 crc kubenswrapper[4837]: I0111 18:07:14.925893 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ls2z"] Jan 11 18:07:15 crc kubenswrapper[4837]: I0111 18:07:15.095911 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4vdb9"] Jan 11 18:07:15 crc kubenswrapper[4837]: W0111 18:07:15.147514 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod818995b0_a7e2_40de_99d2_d0256d2ef2f6.slice/crio-1b1c648626a8a94af5ff2242a72acce9ab1f2c3ae4a905df97776cc3a4ea3a56 WatchSource:0}: Error finding container 1b1c648626a8a94af5ff2242a72acce9ab1f2c3ae4a905df97776cc3a4ea3a56: Status 404 returned error can't find the container with id 1b1c648626a8a94af5ff2242a72acce9ab1f2c3ae4a905df97776cc3a4ea3a56 Jan 11 18:07:15 crc kubenswrapper[4837]: I0111 18:07:15.767293 4837 generic.go:334] "Generic (PLEG): container finished" podID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerID="53895ee4d5577d81b07b60fbd2c9b984fc609045db0f08e7f539ecf8513fc379" exitCode=0 Jan 11 18:07:15 crc kubenswrapper[4837]: I0111 18:07:15.767381 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vdb9" event={"ID":"818995b0-a7e2-40de-99d2-d0256d2ef2f6","Type":"ContainerDied","Data":"53895ee4d5577d81b07b60fbd2c9b984fc609045db0f08e7f539ecf8513fc379"} Jan 11 18:07:15 crc kubenswrapper[4837]: I0111 18:07:15.768095 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vdb9" event={"ID":"818995b0-a7e2-40de-99d2-d0256d2ef2f6","Type":"ContainerStarted","Data":"1b1c648626a8a94af5ff2242a72acce9ab1f2c3ae4a905df97776cc3a4ea3a56"} Jan 11 18:07:15 crc kubenswrapper[4837]: I0111 18:07:15.770879 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="b05e8d99-38a1-4a31-a540-97058c253b29" containerID="9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e" exitCode=0 Jan 11 18:07:15 crc kubenswrapper[4837]: I0111 18:07:15.770920 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ls2z" event={"ID":"b05e8d99-38a1-4a31-a540-97058c253b29","Type":"ContainerDied","Data":"9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e"} Jan 11 18:07:15 crc kubenswrapper[4837]: I0111 18:07:15.770949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ls2z" event={"ID":"b05e8d99-38a1-4a31-a540-97058c253b29","Type":"ContainerStarted","Data":"b07bb4358b54e176056a670660237ebfb960712b8a254e84332db02ad2132303"} Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.635557 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dgwhl"] Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.637824 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.657566 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgwhl"] Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.763491 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f94cp\" (UniqueName: \"kubernetes.io/projected/94fcad98-33aa-47ad-ba40-2a685af9b8b3-kube-api-access-f94cp\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.763538 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-utilities\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.763764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-catalog-content\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.784327 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vdb9" event={"ID":"818995b0-a7e2-40de-99d2-d0256d2ef2f6","Type":"ContainerStarted","Data":"5f78d3eaeeb454d99391ee7b8f84a4ee582108738f2bc039ce70955ab299158e"} Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.865561 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-utilities\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.865774 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-catalog-content\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.865890 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f94cp\" (UniqueName: \"kubernetes.io/projected/94fcad98-33aa-47ad-ba40-2a685af9b8b3-kube-api-access-f94cp\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.866173 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-catalog-content\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.866172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-utilities\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.894928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f94cp\" (UniqueName: 
\"kubernetes.io/projected/94fcad98-33aa-47ad-ba40-2a685af9b8b3-kube-api-access-f94cp\") pod \"certified-operators-dgwhl\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:16 crc kubenswrapper[4837]: I0111 18:07:16.961483 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:17 crc kubenswrapper[4837]: I0111 18:07:17.433851 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgwhl"] Jan 11 18:07:17 crc kubenswrapper[4837]: W0111 18:07:17.442180 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94fcad98_33aa_47ad_ba40_2a685af9b8b3.slice/crio-291fcfb1b76e7803551d1c8bacc3cc73c8f95d62af0c43dfa3aa6faeae672441 WatchSource:0}: Error finding container 291fcfb1b76e7803551d1c8bacc3cc73c8f95d62af0c43dfa3aa6faeae672441: Status 404 returned error can't find the container with id 291fcfb1b76e7803551d1c8bacc3cc73c8f95d62af0c43dfa3aa6faeae672441 Jan 11 18:07:17 crc kubenswrapper[4837]: I0111 18:07:17.800523 4837 generic.go:334] "Generic (PLEG): container finished" podID="b05e8d99-38a1-4a31-a540-97058c253b29" containerID="59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b" exitCode=0 Jan 11 18:07:17 crc kubenswrapper[4837]: I0111 18:07:17.800614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ls2z" event={"ID":"b05e8d99-38a1-4a31-a540-97058c253b29","Type":"ContainerDied","Data":"59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b"} Jan 11 18:07:17 crc kubenswrapper[4837]: I0111 18:07:17.804889 4837 generic.go:334] "Generic (PLEG): container finished" podID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerID="6108e9a7b8131cf2ccf93c7adc7ca57023958b54da6d04160a5af0b8d1ad50b0" exitCode=0 Jan 11 18:07:17 crc 
kubenswrapper[4837]: I0111 18:07:17.804961 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgwhl" event={"ID":"94fcad98-33aa-47ad-ba40-2a685af9b8b3","Type":"ContainerDied","Data":"6108e9a7b8131cf2ccf93c7adc7ca57023958b54da6d04160a5af0b8d1ad50b0"} Jan 11 18:07:17 crc kubenswrapper[4837]: I0111 18:07:17.804991 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgwhl" event={"ID":"94fcad98-33aa-47ad-ba40-2a685af9b8b3","Type":"ContainerStarted","Data":"291fcfb1b76e7803551d1c8bacc3cc73c8f95d62af0c43dfa3aa6faeae672441"} Jan 11 18:07:17 crc kubenswrapper[4837]: I0111 18:07:17.808327 4837 generic.go:334] "Generic (PLEG): container finished" podID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerID="5f78d3eaeeb454d99391ee7b8f84a4ee582108738f2bc039ce70955ab299158e" exitCode=0 Jan 11 18:07:17 crc kubenswrapper[4837]: I0111 18:07:17.808406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vdb9" event={"ID":"818995b0-a7e2-40de-99d2-d0256d2ef2f6","Type":"ContainerDied","Data":"5f78d3eaeeb454d99391ee7b8f84a4ee582108738f2bc039ce70955ab299158e"} Jan 11 18:07:18 crc kubenswrapper[4837]: I0111 18:07:18.856918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ls2z" event={"ID":"b05e8d99-38a1-4a31-a540-97058c253b29","Type":"ContainerStarted","Data":"c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f"} Jan 11 18:07:18 crc kubenswrapper[4837]: I0111 18:07:18.860461 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgwhl" event={"ID":"94fcad98-33aa-47ad-ba40-2a685af9b8b3","Type":"ContainerStarted","Data":"edfb3e06f1c480f6171a06b6132bc84c3314e1f95a00030cc32258d0ba2b1ac7"} Jan 11 18:07:18 crc kubenswrapper[4837]: I0111 18:07:18.867860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-4vdb9" event={"ID":"818995b0-a7e2-40de-99d2-d0256d2ef2f6","Type":"ContainerStarted","Data":"c2c7240fbf77516a4c435b6b6137e62a2a0cd0ad8d790d88a50fa022f6b78614"} Jan 11 18:07:18 crc kubenswrapper[4837]: I0111 18:07:18.885442 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7ls2z" podStartSLOduration=2.4062983669999998 podStartE2EDuration="4.885424036s" podCreationTimestamp="2026-01-11 18:07:14 +0000 UTC" firstStartedPulling="2026-01-11 18:07:15.774640429 +0000 UTC m=+2209.952833135" lastFinishedPulling="2026-01-11 18:07:18.253766098 +0000 UTC m=+2212.431958804" observedRunningTime="2026-01-11 18:07:18.877499793 +0000 UTC m=+2213.055692499" watchObservedRunningTime="2026-01-11 18:07:18.885424036 +0000 UTC m=+2213.063616742" Jan 11 18:07:18 crc kubenswrapper[4837]: I0111 18:07:18.894783 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4vdb9" podStartSLOduration=2.394723016 podStartE2EDuration="4.894748556s" podCreationTimestamp="2026-01-11 18:07:14 +0000 UTC" firstStartedPulling="2026-01-11 18:07:15.769504651 +0000 UTC m=+2209.947697377" lastFinishedPulling="2026-01-11 18:07:18.269530211 +0000 UTC m=+2212.447722917" observedRunningTime="2026-01-11 18:07:18.894688055 +0000 UTC m=+2213.072880751" watchObservedRunningTime="2026-01-11 18:07:18.894748556 +0000 UTC m=+2213.072941262" Jan 11 18:07:19 crc kubenswrapper[4837]: I0111 18:07:19.877020 4837 generic.go:334] "Generic (PLEG): container finished" podID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerID="edfb3e06f1c480f6171a06b6132bc84c3314e1f95a00030cc32258d0ba2b1ac7" exitCode=0 Jan 11 18:07:19 crc kubenswrapper[4837]: I0111 18:07:19.877124 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgwhl" 
event={"ID":"94fcad98-33aa-47ad-ba40-2a685af9b8b3","Type":"ContainerDied","Data":"edfb3e06f1c480f6171a06b6132bc84c3314e1f95a00030cc32258d0ba2b1ac7"} Jan 11 18:07:21 crc kubenswrapper[4837]: I0111 18:07:21.902134 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgwhl" event={"ID":"94fcad98-33aa-47ad-ba40-2a685af9b8b3","Type":"ContainerStarted","Data":"f5591ad9c9adbc5a2bd3676d00838f30161ea3e290430a260192da7b12712f76"} Jan 11 18:07:21 crc kubenswrapper[4837]: I0111 18:07:21.930969 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dgwhl" podStartSLOduration=2.953178267 podStartE2EDuration="5.930949629s" podCreationTimestamp="2026-01-11 18:07:16 +0000 UTC" firstStartedPulling="2026-01-11 18:07:17.806732311 +0000 UTC m=+2211.984925057" lastFinishedPulling="2026-01-11 18:07:20.784503693 +0000 UTC m=+2214.962696419" observedRunningTime="2026-01-11 18:07:21.923368755 +0000 UTC m=+2216.101561461" watchObservedRunningTime="2026-01-11 18:07:21.930949629 +0000 UTC m=+2216.109142335" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.407629 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.408585 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.477035 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.631773 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.632062 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.688583 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.973775 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:24 crc kubenswrapper[4837]: I0111 18:07:24.973891 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:26 crc kubenswrapper[4837]: I0111 18:07:26.022722 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ls2z"] Jan 11 18:07:26 crc kubenswrapper[4837]: I0111 18:07:26.962590 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:26 crc kubenswrapper[4837]: I0111 18:07:26.963084 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:27 crc kubenswrapper[4837]: I0111 18:07:27.025118 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4vdb9"] Jan 11 18:07:27 crc kubenswrapper[4837]: I0111 18:07:27.025358 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4vdb9" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="registry-server" containerID="cri-o://c2c7240fbf77516a4c435b6b6137e62a2a0cd0ad8d790d88a50fa022f6b78614" gracePeriod=2 Jan 11 18:07:27 crc kubenswrapper[4837]: I0111 18:07:27.049914 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:27 crc kubenswrapper[4837]: I0111 18:07:27.972451 4837 generic.go:334] 
"Generic (PLEG): container finished" podID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerID="c2c7240fbf77516a4c435b6b6137e62a2a0cd0ad8d790d88a50fa022f6b78614" exitCode=0 Jan 11 18:07:27 crc kubenswrapper[4837]: I0111 18:07:27.972798 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vdb9" event={"ID":"818995b0-a7e2-40de-99d2-d0256d2ef2f6","Type":"ContainerDied","Data":"c2c7240fbf77516a4c435b6b6137e62a2a0cd0ad8d790d88a50fa022f6b78614"} Jan 11 18:07:27 crc kubenswrapper[4837]: I0111 18:07:27.973138 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7ls2z" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="registry-server" containerID="cri-o://c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f" gracePeriod=2 Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.044740 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.151433 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.292746 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-utilities\") pod \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.293569 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-utilities" (OuterVolumeSpecName: "utilities") pod "818995b0-a7e2-40de-99d2-d0256d2ef2f6" (UID: "818995b0-a7e2-40de-99d2-d0256d2ef2f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.293793 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-catalog-content\") pod \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.293974 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nms58\" (UniqueName: \"kubernetes.io/projected/818995b0-a7e2-40de-99d2-d0256d2ef2f6-kube-api-access-nms58\") pod \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\" (UID: \"818995b0-a7e2-40de-99d2-d0256d2ef2f6\") " Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.294710 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.302745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818995b0-a7e2-40de-99d2-d0256d2ef2f6-kube-api-access-nms58" (OuterVolumeSpecName: "kube-api-access-nms58") pod "818995b0-a7e2-40de-99d2-d0256d2ef2f6" (UID: "818995b0-a7e2-40de-99d2-d0256d2ef2f6"). InnerVolumeSpecName "kube-api-access-nms58". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.351201 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "818995b0-a7e2-40de-99d2-d0256d2ef2f6" (UID: "818995b0-a7e2-40de-99d2-d0256d2ef2f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.396736 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nms58\" (UniqueName: \"kubernetes.io/projected/818995b0-a7e2-40de-99d2-d0256d2ef2f6-kube-api-access-nms58\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.396781 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818995b0-a7e2-40de-99d2-d0256d2ef2f6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.406047 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.498841 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-utilities\") pod \"b05e8d99-38a1-4a31-a540-97058c253b29\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.499127 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sz4k\" (UniqueName: \"kubernetes.io/projected/b05e8d99-38a1-4a31-a540-97058c253b29-kube-api-access-9sz4k\") pod \"b05e8d99-38a1-4a31-a540-97058c253b29\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.499885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-utilities" (OuterVolumeSpecName: "utilities") pod "b05e8d99-38a1-4a31-a540-97058c253b29" (UID: "b05e8d99-38a1-4a31-a540-97058c253b29"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.500157 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.504297 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05e8d99-38a1-4a31-a540-97058c253b29-kube-api-access-9sz4k" (OuterVolumeSpecName: "kube-api-access-9sz4k") pod "b05e8d99-38a1-4a31-a540-97058c253b29" (UID: "b05e8d99-38a1-4a31-a540-97058c253b29"). InnerVolumeSpecName "kube-api-access-9sz4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.600989 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-catalog-content\") pod \"b05e8d99-38a1-4a31-a540-97058c253b29\" (UID: \"b05e8d99-38a1-4a31-a540-97058c253b29\") " Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.601628 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sz4k\" (UniqueName: \"kubernetes.io/projected/b05e8d99-38a1-4a31-a540-97058c253b29-kube-api-access-9sz4k\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.629326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b05e8d99-38a1-4a31-a540-97058c253b29" (UID: "b05e8d99-38a1-4a31-a540-97058c253b29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:07:28 crc kubenswrapper[4837]: I0111 18:07:28.703436 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05e8d99-38a1-4a31-a540-97058c253b29-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.012006 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4vdb9" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.012018 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vdb9" event={"ID":"818995b0-a7e2-40de-99d2-d0256d2ef2f6","Type":"ContainerDied","Data":"1b1c648626a8a94af5ff2242a72acce9ab1f2c3ae4a905df97776cc3a4ea3a56"} Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.012403 4837 scope.go:117] "RemoveContainer" containerID="c2c7240fbf77516a4c435b6b6137e62a2a0cd0ad8d790d88a50fa022f6b78614" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.018336 4837 generic.go:334] "Generic (PLEG): container finished" podID="b05e8d99-38a1-4a31-a540-97058c253b29" containerID="c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f" exitCode=0 Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.018620 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ls2z" event={"ID":"b05e8d99-38a1-4a31-a540-97058c253b29","Type":"ContainerDied","Data":"c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f"} Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.018791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ls2z" event={"ID":"b05e8d99-38a1-4a31-a540-97058c253b29","Type":"ContainerDied","Data":"b07bb4358b54e176056a670660237ebfb960712b8a254e84332db02ad2132303"} Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.018665 4837 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ls2z" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.063352 4837 scope.go:117] "RemoveContainer" containerID="5f78d3eaeeb454d99391ee7b8f84a4ee582108738f2bc039ce70955ab299158e" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.065168 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4vdb9"] Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.077787 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4vdb9"] Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.090577 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ls2z"] Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.098749 4837 scope.go:117] "RemoveContainer" containerID="53895ee4d5577d81b07b60fbd2c9b984fc609045db0f08e7f539ecf8513fc379" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.101013 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ls2z"] Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.162822 4837 scope.go:117] "RemoveContainer" containerID="c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.227718 4837 scope.go:117] "RemoveContainer" containerID="59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.256355 4837 scope.go:117] "RemoveContainer" containerID="9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.285408 4837 scope.go:117] "RemoveContainer" containerID="c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f" Jan 11 18:07:29 crc kubenswrapper[4837]: E0111 18:07:29.285940 4837 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f\": container with ID starting with c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f not found: ID does not exist" containerID="c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.285975 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f"} err="failed to get container status \"c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f\": rpc error: code = NotFound desc = could not find container \"c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f\": container with ID starting with c72c9bb9398e13ce936d2f50673b178bd8ec848a54dcb69d256fda01cc3a493f not found: ID does not exist" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.285994 4837 scope.go:117] "RemoveContainer" containerID="59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b" Jan 11 18:07:29 crc kubenswrapper[4837]: E0111 18:07:29.286570 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b\": container with ID starting with 59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b not found: ID does not exist" containerID="59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.286670 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b"} err="failed to get container status \"59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b\": rpc error: code = NotFound desc = could not find container 
\"59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b\": container with ID starting with 59e0be6ac5e0a05145dbb91db5aea95f4318da2bae70932b57afe2ebd49ef87b not found: ID does not exist" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.286760 4837 scope.go:117] "RemoveContainer" containerID="9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e" Jan 11 18:07:29 crc kubenswrapper[4837]: E0111 18:07:29.287631 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e\": container with ID starting with 9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e not found: ID does not exist" containerID="9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e" Jan 11 18:07:29 crc kubenswrapper[4837]: I0111 18:07:29.287655 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e"} err="failed to get container status \"9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e\": rpc error: code = NotFound desc = could not find container \"9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e\": container with ID starting with 9b71b0998dea6fab561519acf168ab967dac39f045d3041bd1df9e99c221448e not found: ID does not exist" Jan 11 18:07:30 crc kubenswrapper[4837]: I0111 18:07:30.382323 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" path="/var/lib/kubelet/pods/818995b0-a7e2-40de-99d2-d0256d2ef2f6/volumes" Jan 11 18:07:30 crc kubenswrapper[4837]: I0111 18:07:30.384442 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" path="/var/lib/kubelet/pods/b05e8d99-38a1-4a31-a540-97058c253b29/volumes" Jan 11 18:07:30 crc kubenswrapper[4837]: I0111 18:07:30.832397 
4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dgwhl"] Jan 11 18:07:30 crc kubenswrapper[4837]: I0111 18:07:30.832764 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dgwhl" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="registry-server" containerID="cri-o://f5591ad9c9adbc5a2bd3676d00838f30161ea3e290430a260192da7b12712f76" gracePeriod=2 Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.064610 4837 generic.go:334] "Generic (PLEG): container finished" podID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerID="f5591ad9c9adbc5a2bd3676d00838f30161ea3e290430a260192da7b12712f76" exitCode=0 Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.064648 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgwhl" event={"ID":"94fcad98-33aa-47ad-ba40-2a685af9b8b3","Type":"ContainerDied","Data":"f5591ad9c9adbc5a2bd3676d00838f30161ea3e290430a260192da7b12712f76"} Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.309718 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.467854 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-catalog-content\") pod \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.467954 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-utilities\") pod \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.468030 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f94cp\" (UniqueName: \"kubernetes.io/projected/94fcad98-33aa-47ad-ba40-2a685af9b8b3-kube-api-access-f94cp\") pod \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\" (UID: \"94fcad98-33aa-47ad-ba40-2a685af9b8b3\") " Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.469368 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-utilities" (OuterVolumeSpecName: "utilities") pod "94fcad98-33aa-47ad-ba40-2a685af9b8b3" (UID: "94fcad98-33aa-47ad-ba40-2a685af9b8b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.477533 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94fcad98-33aa-47ad-ba40-2a685af9b8b3-kube-api-access-f94cp" (OuterVolumeSpecName: "kube-api-access-f94cp") pod "94fcad98-33aa-47ad-ba40-2a685af9b8b3" (UID: "94fcad98-33aa-47ad-ba40-2a685af9b8b3"). InnerVolumeSpecName "kube-api-access-f94cp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.534660 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94fcad98-33aa-47ad-ba40-2a685af9b8b3" (UID: "94fcad98-33aa-47ad-ba40-2a685af9b8b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.571477 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.571543 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94fcad98-33aa-47ad-ba40-2a685af9b8b3-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:31 crc kubenswrapper[4837]: I0111 18:07:31.571572 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f94cp\" (UniqueName: \"kubernetes.io/projected/94fcad98-33aa-47ad-ba40-2a685af9b8b3-kube-api-access-f94cp\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.081639 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgwhl" event={"ID":"94fcad98-33aa-47ad-ba40-2a685af9b8b3","Type":"ContainerDied","Data":"291fcfb1b76e7803551d1c8bacc3cc73c8f95d62af0c43dfa3aa6faeae672441"} Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.081730 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgwhl" Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.082196 4837 scope.go:117] "RemoveContainer" containerID="f5591ad9c9adbc5a2bd3676d00838f30161ea3e290430a260192da7b12712f76" Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.117274 4837 scope.go:117] "RemoveContainer" containerID="edfb3e06f1c480f6171a06b6132bc84c3314e1f95a00030cc32258d0ba2b1ac7" Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.140559 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dgwhl"] Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.150956 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dgwhl"] Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.173388 4837 scope.go:117] "RemoveContainer" containerID="6108e9a7b8131cf2ccf93c7adc7ca57023958b54da6d04160a5af0b8d1ad50b0" Jan 11 18:07:32 crc kubenswrapper[4837]: I0111 18:07:32.391002 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" path="/var/lib/kubelet/pods/94fcad98-33aa-47ad-ba40-2a685af9b8b3/volumes" Jan 11 18:07:51 crc kubenswrapper[4837]: I0111 18:07:51.274703 4837 generic.go:334] "Generic (PLEG): container finished" podID="dafad3b0-31b4-467e-9604-485cb65e91e5" containerID="3c9b6eb638c464a1cd8b130767ebba1ffe94561fed63eb8cbf54a0e858e60904" exitCode=0 Jan 11 18:07:51 crc kubenswrapper[4837]: I0111 18:07:51.274777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" event={"ID":"dafad3b0-31b4-467e-9604-485cb65e91e5","Type":"ContainerDied","Data":"3c9b6eb638c464a1cd8b130767ebba1ffe94561fed63eb8cbf54a0e858e60904"} Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.766488 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.943913 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzwwk\" (UniqueName: \"kubernetes.io/projected/dafad3b0-31b4-467e-9604-485cb65e91e5-kube-api-access-wzwwk\") pod \"dafad3b0-31b4-467e-9604-485cb65e91e5\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.944001 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-ssh-key-openstack-edpm-ipam\") pod \"dafad3b0-31b4-467e-9604-485cb65e91e5\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.944072 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-inventory\") pod \"dafad3b0-31b4-467e-9604-485cb65e91e5\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.944097 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-metadata-combined-ca-bundle\") pod \"dafad3b0-31b4-467e-9604-485cb65e91e5\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.944136 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"dafad3b0-31b4-467e-9604-485cb65e91e5\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " Jan 11 
18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.944257 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-nova-metadata-neutron-config-0\") pod \"dafad3b0-31b4-467e-9604-485cb65e91e5\" (UID: \"dafad3b0-31b4-467e-9604-485cb65e91e5\") " Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.960903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dafad3b0-31b4-467e-9604-485cb65e91e5" (UID: "dafad3b0-31b4-467e-9604-485cb65e91e5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.962365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafad3b0-31b4-467e-9604-485cb65e91e5-kube-api-access-wzwwk" (OuterVolumeSpecName: "kube-api-access-wzwwk") pod "dafad3b0-31b4-467e-9604-485cb65e91e5" (UID: "dafad3b0-31b4-467e-9604-485cb65e91e5"). InnerVolumeSpecName "kube-api-access-wzwwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.980868 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-inventory" (OuterVolumeSpecName: "inventory") pod "dafad3b0-31b4-467e-9604-485cb65e91e5" (UID: "dafad3b0-31b4-467e-9604-485cb65e91e5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.990172 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "dafad3b0-31b4-467e-9604-485cb65e91e5" (UID: "dafad3b0-31b4-467e-9604-485cb65e91e5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.990586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dafad3b0-31b4-467e-9604-485cb65e91e5" (UID: "dafad3b0-31b4-467e-9604-485cb65e91e5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:07:52 crc kubenswrapper[4837]: I0111 18:07:52.997350 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "dafad3b0-31b4-467e-9604-485cb65e91e5" (UID: "dafad3b0-31b4-467e-9604-485cb65e91e5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.046237 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzwwk\" (UniqueName: \"kubernetes.io/projected/dafad3b0-31b4-467e-9604-485cb65e91e5-kube-api-access-wzwwk\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.046908 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.046930 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.046940 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.046955 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.046968 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dafad3b0-31b4-467e-9604-485cb65e91e5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.299721 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" event={"ID":"dafad3b0-31b4-467e-9604-485cb65e91e5","Type":"ContainerDied","Data":"b4d519fa8ed671ee8a281c6e940ccfa8500c4cc4a57cd472f2c72c0888d24b77"} Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.299822 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4d519fa8ed671ee8a281c6e940ccfa8500c4cc4a57cd472f2c72c0888d24b77" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.299943 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.425896 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s"] Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426288 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="extract-utilities" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426303 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="extract-utilities" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426324 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426332 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426355 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="extract-utilities" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426364 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="extract-utilities" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426376 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="extract-content" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426385 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="extract-content" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426403 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="extract-content" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426411 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="extract-content" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426426 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafad3b0-31b4-467e-9604-485cb65e91e5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426436 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafad3b0-31b4-467e-9604-485cb65e91e5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426453 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426461 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426476 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 
18:07:53.426484 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426510 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="extract-utilities" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426518 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="extract-utilities" Jan 11 18:07:53 crc kubenswrapper[4837]: E0111 18:07:53.426532 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="extract-content" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426540 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="extract-content" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426817 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b05e8d99-38a1-4a31-a540-97058c253b29" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426842 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="94fcad98-33aa-47ad-ba40-2a685af9b8b3" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426865 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="818995b0-a7e2-40de-99d2-d0256d2ef2f6" containerName="registry-server" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.426884 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafad3b0-31b4-467e-9604-485cb65e91e5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.427558 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.429192 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.429410 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.429650 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.430267 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.434378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.442727 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s"] Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.456120 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.456290 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.456500 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkm6n\" (UniqueName: \"kubernetes.io/projected/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-kube-api-access-qkm6n\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.456605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.456632 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.558097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkm6n\" (UniqueName: \"kubernetes.io/projected/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-kube-api-access-qkm6n\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.558167 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.558187 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.558245 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.558287 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.562002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" 
(UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.562205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.563718 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.566711 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.584471 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkm6n\" (UniqueName: \"kubernetes.io/projected/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-kube-api-access-qkm6n\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:53 crc kubenswrapper[4837]: I0111 18:07:53.754456 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:07:54 crc kubenswrapper[4837]: I0111 18:07:54.092340 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s"] Jan 11 18:07:54 crc kubenswrapper[4837]: I0111 18:07:54.312501 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" event={"ID":"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1","Type":"ContainerStarted","Data":"a1a188ee8142e7514a5778a59b2537d22a15c5562af78960ac7be6034f2807ad"} Jan 11 18:07:55 crc kubenswrapper[4837]: I0111 18:07:55.323248 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" event={"ID":"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1","Type":"ContainerStarted","Data":"5a7123d314f5173108d174d1164d2bb1ae68f244eff3c031996841f9f80cba38"} Jan 11 18:07:55 crc kubenswrapper[4837]: I0111 18:07:55.365085 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" podStartSLOduration=1.748099178 podStartE2EDuration="2.365062191s" podCreationTimestamp="2026-01-11 18:07:53 +0000 UTC" firstStartedPulling="2026-01-11 18:07:54.107220871 +0000 UTC m=+2248.285413577" lastFinishedPulling="2026-01-11 18:07:54.724183884 +0000 UTC m=+2248.902376590" observedRunningTime="2026-01-11 18:07:55.347612792 +0000 UTC m=+2249.525805528" watchObservedRunningTime="2026-01-11 18:07:55.365062191 +0000 UTC m=+2249.543254897" Jan 11 18:08:09 crc kubenswrapper[4837]: I0111 18:08:09.444910 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:08:09 crc kubenswrapper[4837]: I0111 
18:08:09.446868 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:08:39 crc kubenswrapper[4837]: I0111 18:08:39.444438 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:08:39 crc kubenswrapper[4837]: I0111 18:08:39.445242 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:09:09 crc kubenswrapper[4837]: I0111 18:09:09.444209 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:09:09 crc kubenswrapper[4837]: I0111 18:09:09.444748 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:09:09 crc kubenswrapper[4837]: I0111 18:09:09.444794 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:09:09 crc kubenswrapper[4837]: I0111 18:09:09.445540 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:09:09 crc kubenswrapper[4837]: I0111 18:09:09.445594 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" gracePeriod=600 Jan 11 18:09:09 crc kubenswrapper[4837]: E0111 18:09:09.591710 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:09:10 crc kubenswrapper[4837]: I0111 18:09:10.058196 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" exitCode=0 Jan 11 18:09:10 crc kubenswrapper[4837]: I0111 18:09:10.058244 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"} Jan 11 18:09:10 crc 
kubenswrapper[4837]: I0111 18:09:10.058278 4837 scope.go:117] "RemoveContainer" containerID="6a353f8b2825371ba8ab38b9b1d705da9972513542365a5827d613ce87b61f00" Jan 11 18:09:10 crc kubenswrapper[4837]: I0111 18:09:10.059457 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:09:10 crc kubenswrapper[4837]: E0111 18:09:10.060142 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:09:21 crc kubenswrapper[4837]: I0111 18:09:21.364551 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:09:21 crc kubenswrapper[4837]: E0111 18:09:21.365314 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:09:32 crc kubenswrapper[4837]: I0111 18:09:32.365197 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:09:32 crc kubenswrapper[4837]: E0111 18:09:32.366266 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:09:44 crc kubenswrapper[4837]: I0111 18:09:44.364145 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:09:44 crc kubenswrapper[4837]: E0111 18:09:44.364894 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:09:57 crc kubenswrapper[4837]: I0111 18:09:57.365507 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:09:57 crc kubenswrapper[4837]: E0111 18:09:57.366304 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:10:09 crc kubenswrapper[4837]: I0111 18:10:09.364332 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:10:09 crc kubenswrapper[4837]: E0111 18:10:09.365268 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:10:21 crc kubenswrapper[4837]: I0111 18:10:21.365096 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:10:21 crc kubenswrapper[4837]: E0111 18:10:21.366158 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:10:36 crc kubenswrapper[4837]: I0111 18:10:36.369961 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:10:36 crc kubenswrapper[4837]: E0111 18:10:36.370788 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:10:47 crc kubenswrapper[4837]: I0111 18:10:47.364720 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:10:47 crc kubenswrapper[4837]: E0111 18:10:47.366134 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:11:00 crc kubenswrapper[4837]: I0111 18:11:00.365291 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:11:00 crc kubenswrapper[4837]: E0111 18:11:00.366568 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:11:11 crc kubenswrapper[4837]: I0111 18:11:11.364638 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:11:11 crc kubenswrapper[4837]: E0111 18:11:11.365594 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:11:22 crc kubenswrapper[4837]: I0111 18:11:22.364964 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:11:22 crc kubenswrapper[4837]: E0111 18:11:22.365590 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:11:36 crc kubenswrapper[4837]: I0111 18:11:36.374164 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:11:36 crc kubenswrapper[4837]: E0111 18:11:36.384261 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:11:37 crc kubenswrapper[4837]: I0111 18:11:37.408829 4837 scope.go:117] "RemoveContainer" containerID="c8a927c41c75cf7e8858cfbaa70d6f3ffa4415a757c08a03ca207c57ea7dc379" Jan 11 18:11:37 crc kubenswrapper[4837]: I0111 18:11:37.431404 4837 scope.go:117] "RemoveContainer" containerID="8e1afa8fe449a2bcf25de104615bc6e0158748c772908ef8d8238e803bd6a977" Jan 11 18:11:37 crc kubenswrapper[4837]: I0111 18:11:37.455866 4837 scope.go:117] "RemoveContainer" containerID="abf450cd9e43656072632cd75138e94762d03ff39643b27e0f49a6975c79d917" Jan 11 18:11:49 crc kubenswrapper[4837]: I0111 18:11:49.364191 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:11:49 crc kubenswrapper[4837]: E0111 18:11:49.365054 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:12:01 crc kubenswrapper[4837]: I0111 18:12:01.364184 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:12:01 crc kubenswrapper[4837]: E0111 18:12:01.364913 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:12:14 crc kubenswrapper[4837]: I0111 18:12:14.364969 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:12:14 crc kubenswrapper[4837]: E0111 18:12:14.365976 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:12:21 crc kubenswrapper[4837]: I0111 18:12:21.997148 4837 generic.go:334] "Generic (PLEG): container finished" podID="2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" containerID="5a7123d314f5173108d174d1164d2bb1ae68f244eff3c031996841f9f80cba38" exitCode=0 Jan 11 18:12:21 crc kubenswrapper[4837]: I0111 18:12:21.997240 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" event={"ID":"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1","Type":"ContainerDied","Data":"5a7123d314f5173108d174d1164d2bb1ae68f244eff3c031996841f9f80cba38"} Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.385763 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.486562 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-ssh-key-openstack-edpm-ipam\") pod \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.486695 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-secret-0\") pod \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.486836 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-combined-ca-bundle\") pod \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.486861 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkm6n\" (UniqueName: \"kubernetes.io/projected/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-kube-api-access-qkm6n\") pod \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.486896 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-inventory\") pod \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\" (UID: \"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1\") " Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.493109 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" (UID: "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.493147 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-kube-api-access-qkm6n" (OuterVolumeSpecName: "kube-api-access-qkm6n") pod "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" (UID: "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1"). InnerVolumeSpecName "kube-api-access-qkm6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.513777 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" (UID: "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.514074 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-inventory" (OuterVolumeSpecName: "inventory") pod "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" (UID: "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.528437 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" (UID: "2948a5a1-4557-4e6b-82d0-6b8e9d7408b1"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.588994 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.589017 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.589028 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.589037 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:12:23 crc kubenswrapper[4837]: I0111 18:12:23.589047 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkm6n\" (UniqueName: \"kubernetes.io/projected/2948a5a1-4557-4e6b-82d0-6b8e9d7408b1-kube-api-access-qkm6n\") on node \"crc\" DevicePath \"\"" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.019379 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" event={"ID":"2948a5a1-4557-4e6b-82d0-6b8e9d7408b1","Type":"ContainerDied","Data":"a1a188ee8142e7514a5778a59b2537d22a15c5562af78960ac7be6034f2807ad"} Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.019428 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1a188ee8142e7514a5778a59b2537d22a15c5562af78960ac7be6034f2807ad" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.019467 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.131409 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7"] Jan 11 18:12:24 crc kubenswrapper[4837]: E0111 18:12:24.132039 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.132071 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.132430 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2948a5a1-4557-4e6b-82d0-6b8e9d7408b1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.133408 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.136929 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.137891 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.138068 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.138209 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.138482 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.138605 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.140447 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.149749 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7"] Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.199873 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.199948 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.200012 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.200030 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.200347 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.200440 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.200516 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqqks\" (UniqueName: \"kubernetes.io/projected/af0d2223-27cf-46b8-9105-735784f027d5-kube-api-access-xqqks\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.200613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af0d2223-27cf-46b8-9105-735784f027d5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.200653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.302666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.302751 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.302806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.302827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.302914 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.302947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.302982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqqks\" (UniqueName: \"kubernetes.io/projected/af0d2223-27cf-46b8-9105-735784f027d5-kube-api-access-xqqks\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.303019 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.303045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af0d2223-27cf-46b8-9105-735784f027d5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.308430 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af0d2223-27cf-46b8-9105-735784f027d5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.309927 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.310040 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.313983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.315079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.327212 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.327296 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.328259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.331055 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqqks\" (UniqueName: \"kubernetes.io/projected/af0d2223-27cf-46b8-9105-735784f027d5-kube-api-access-xqqks\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fctb7\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:12:24 crc kubenswrapper[4837]: I0111 18:12:24.465326 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7"
Jan 11 18:12:25 crc kubenswrapper[4837]: I0111 18:12:25.026962 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 11 18:12:25 crc kubenswrapper[4837]: I0111 18:12:25.035892 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7"]
Jan 11 18:12:26 crc kubenswrapper[4837]: I0111 18:12:26.037921 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" event={"ID":"af0d2223-27cf-46b8-9105-735784f027d5","Type":"ContainerStarted","Data":"f4084b0ea233aca9f994420e98cb13af13871c931fd6e46ee0aa604360efdcbd"}
Jan 11 18:12:26 crc kubenswrapper[4837]: I0111 18:12:26.038280 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" event={"ID":"af0d2223-27cf-46b8-9105-735784f027d5","Type":"ContainerStarted","Data":"83732a97f0101f779f499ea1a9fe449e44c01dbc30ff41e8e3a1bf7e70902aba"}
Jan 11 18:12:26 crc kubenswrapper[4837]: I0111 18:12:26.078424 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" podStartSLOduration=1.596596065 podStartE2EDuration="2.078400709s" podCreationTimestamp="2026-01-11 18:12:24 +0000 UTC" firstStartedPulling="2026-01-11 18:12:25.026733263 +0000 UTC m=+2519.204925969" lastFinishedPulling="2026-01-11 18:12:25.508537907 +0000 UTC m=+2519.686730613" observedRunningTime="2026-01-11 18:12:26.067629519 +0000 UTC m=+2520.245822225" watchObservedRunningTime="2026-01-11 18:12:26.078400709 +0000 UTC m=+2520.256593425"
Jan 11 18:12:29 crc kubenswrapper[4837]: I0111 18:12:29.364721 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:12:29 crc kubenswrapper[4837]: E0111 18:12:29.365592 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:12:40 crc kubenswrapper[4837]: I0111 18:12:40.363970 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:12:40 crc kubenswrapper[4837]: E0111 18:12:40.364794 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:12:55 crc kubenswrapper[4837]: I0111 18:12:55.365200 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:12:55 crc kubenswrapper[4837]: E0111 18:12:55.366777 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:13:10 crc kubenswrapper[4837]: I0111 18:13:10.364462 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:13:10 crc kubenswrapper[4837]: E0111 18:13:10.365609 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:13:25 crc kubenswrapper[4837]: I0111 18:13:25.364354 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:13:25 crc kubenswrapper[4837]: E0111 18:13:25.364973 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:13:40 crc kubenswrapper[4837]: I0111 18:13:40.364398 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:13:40 crc kubenswrapper[4837]: E0111 18:13:40.365503 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:13:54 crc kubenswrapper[4837]: I0111 18:13:54.364655 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:13:54 crc kubenswrapper[4837]: E0111 18:13:54.365517 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:14:08 crc kubenswrapper[4837]: I0111 18:14:08.364798 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:14:08 crc kubenswrapper[4837]: E0111 18:14:08.365419 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2"
Jan 11 18:14:20 crc kubenswrapper[4837]: I0111 18:14:20.365390 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89"
Jan 11 18:14:21 crc kubenswrapper[4837]: I0111 18:14:21.159929 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"f1a9906e1f83fc15699d820fe6fec4c1a5b7031cfbe2d186a79b8184951f2fcd"}
Jan 11 18:14:55 crc kubenswrapper[4837]: I0111 18:14:55.491070 4837 generic.go:334] "Generic (PLEG): container finished" podID="af0d2223-27cf-46b8-9105-735784f027d5" containerID="f4084b0ea233aca9f994420e98cb13af13871c931fd6e46ee0aa604360efdcbd" exitCode=0
Jan 11 18:14:55 crc kubenswrapper[4837]: I0111 18:14:55.491158 4837
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" event={"ID":"af0d2223-27cf-46b8-9105-735784f027d5","Type":"ContainerDied","Data":"f4084b0ea233aca9f994420e98cb13af13871c931fd6e46ee0aa604360efdcbd"} Jan 11 18:14:56 crc kubenswrapper[4837]: I0111 18:14:56.872996 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.035283 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqqks\" (UniqueName: \"kubernetes.io/projected/af0d2223-27cf-46b8-9105-735784f027d5-kube-api-access-xqqks\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.035802 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-ssh-key-openstack-edpm-ipam\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.035889 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af0d2223-27cf-46b8-9105-735784f027d5-nova-extra-config-0\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.035942 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-1\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: 
I0111 18:14:57.035992 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-1\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.036029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-inventory\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.036050 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-0\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.036074 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-combined-ca-bundle\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.036103 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-0\") pod \"af0d2223-27cf-46b8-9105-735784f027d5\" (UID: \"af0d2223-27cf-46b8-9105-735784f027d5\") " Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.041832 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/af0d2223-27cf-46b8-9105-735784f027d5-kube-api-access-xqqks" (OuterVolumeSpecName: "kube-api-access-xqqks") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "kube-api-access-xqqks". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.046359 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.066074 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.068725 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.074085 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.075751 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.076562 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.086024 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af0d2223-27cf-46b8-9105-735784f027d5-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.087305 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-inventory" (OuterVolumeSpecName: "inventory") pod "af0d2223-27cf-46b8-9105-735784f027d5" (UID: "af0d2223-27cf-46b8-9105-735784f027d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138437 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138470 4837 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af0d2223-27cf-46b8-9105-735784f027d5-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138479 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138487 4837 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138496 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138504 4837 reconciler_common.go:293] "Volume detached for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138516 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138528 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af0d2223-27cf-46b8-9105-735784f027d5-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.138540 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqqks\" (UniqueName: \"kubernetes.io/projected/af0d2223-27cf-46b8-9105-735784f027d5-kube-api-access-xqqks\") on node \"crc\" DevicePath \"\"" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.511297 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7" event={"ID":"af0d2223-27cf-46b8-9105-735784f027d5","Type":"ContainerDied","Data":"83732a97f0101f779f499ea1a9fe449e44c01dbc30ff41e8e3a1bf7e70902aba"} Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.511344 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83732a97f0101f779f499ea1a9fe449e44c01dbc30ff41e8e3a1bf7e70902aba" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.511419 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fctb7"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.633510 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b"]
Jan 11 18:14:57 crc kubenswrapper[4837]: E0111 18:14:57.634085 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af0d2223-27cf-46b8-9105-735784f027d5" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.634107 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="af0d2223-27cf-46b8-9105-735784f027d5" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.634506 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="af0d2223-27cf-46b8-9105-735784f027d5" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.635489 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.644344 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b"]
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.678361 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.678872 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bzt89"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.679027 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.679200 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.679442 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.778797 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b"
Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.778889 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-0\") pod
\"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.778974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.779000 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.779048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.779394 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.779505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkzr\" (UniqueName: \"kubernetes.io/projected/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-kube-api-access-4pkzr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.881499 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pkzr\" (UniqueName: \"kubernetes.io/projected/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-kube-api-access-4pkzr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.881637 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.881707 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.881736 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.881760 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.881824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.881924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.887585 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.887914 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.889892 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.891274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.893647 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.901849 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:57 crc kubenswrapper[4837]: I0111 18:14:57.902883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pkzr\" (UniqueName: \"kubernetes.io/projected/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-kube-api-access-4pkzr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fg96b\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:58 crc kubenswrapper[4837]: I0111 18:14:58.001303 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:14:58 crc kubenswrapper[4837]: I0111 18:14:58.567164 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b"] Jan 11 18:14:59 crc kubenswrapper[4837]: I0111 18:14:59.529431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" event={"ID":"38ba1b37-c033-461b-bf07-7aecd5d1e5a1","Type":"ContainerStarted","Data":"9e7f26165fcda891d91a7a5a89605df117af27bf9fcc30b9fa7078393a464c2f"} Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.132593 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd"] Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.133814 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.136742 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.139305 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.150526 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd"] Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.331698 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82dc5a36-2841-458f-bfe2-05985b81ee4d-secret-volume\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.331815 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82dc5a36-2841-458f-bfe2-05985b81ee4d-config-volume\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.332274 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktcd5\" (UniqueName: \"kubernetes.io/projected/82dc5a36-2841-458f-bfe2-05985b81ee4d-kube-api-access-ktcd5\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.435489 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktcd5\" (UniqueName: \"kubernetes.io/projected/82dc5a36-2841-458f-bfe2-05985b81ee4d-kube-api-access-ktcd5\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.435640 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82dc5a36-2841-458f-bfe2-05985b81ee4d-secret-volume\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.435779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/82dc5a36-2841-458f-bfe2-05985b81ee4d-config-volume\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.436929 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82dc5a36-2841-458f-bfe2-05985b81ee4d-config-volume\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.441771 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82dc5a36-2841-458f-bfe2-05985b81ee4d-secret-volume\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.455775 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktcd5\" (UniqueName: \"kubernetes.io/projected/82dc5a36-2841-458f-bfe2-05985b81ee4d-kube-api-access-ktcd5\") pod \"collect-profiles-29469255-m7vmd\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.456804 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.541702 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" event={"ID":"38ba1b37-c033-461b-bf07-7aecd5d1e5a1","Type":"ContainerStarted","Data":"3d9167f4dbb4cc706f92e015b354d06066bdd55da31150c3f44887d682592abc"} Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.575199 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" podStartSLOduration=2.839012693 podStartE2EDuration="3.575179295s" podCreationTimestamp="2026-01-11 18:14:57 +0000 UTC" firstStartedPulling="2026-01-11 18:14:58.550282373 +0000 UTC m=+2672.728475099" lastFinishedPulling="2026-01-11 18:14:59.286448945 +0000 UTC m=+2673.464641701" observedRunningTime="2026-01-11 18:15:00.563734197 +0000 UTC m=+2674.741926903" watchObservedRunningTime="2026-01-11 18:15:00.575179295 +0000 UTC m=+2674.753372001" Jan 11 18:15:00 crc kubenswrapper[4837]: I0111 18:15:00.923210 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd"] Jan 11 18:15:00 crc kubenswrapper[4837]: W0111 18:15:00.926992 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82dc5a36_2841_458f_bfe2_05985b81ee4d.slice/crio-6518915d0d38e461a3b4f02865cc5ce1c217cde804c2aaef7bba8dc701b1c734 WatchSource:0}: Error finding container 6518915d0d38e461a3b4f02865cc5ce1c217cde804c2aaef7bba8dc701b1c734: Status 404 returned error can't find the container with id 6518915d0d38e461a3b4f02865cc5ce1c217cde804c2aaef7bba8dc701b1c734 Jan 11 18:15:01 crc kubenswrapper[4837]: I0111 18:15:01.553520 4837 generic.go:334] "Generic (PLEG): container finished" podID="82dc5a36-2841-458f-bfe2-05985b81ee4d" 
containerID="1731172e762d4dd5f4de06e52daa60a5de64fc11568f528e62bbb2ddd1ced6b6" exitCode=0 Jan 11 18:15:01 crc kubenswrapper[4837]: I0111 18:15:01.553666 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" event={"ID":"82dc5a36-2841-458f-bfe2-05985b81ee4d","Type":"ContainerDied","Data":"1731172e762d4dd5f4de06e52daa60a5de64fc11568f528e62bbb2ddd1ced6b6"} Jan 11 18:15:01 crc kubenswrapper[4837]: I0111 18:15:01.553754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" event={"ID":"82dc5a36-2841-458f-bfe2-05985b81ee4d","Type":"ContainerStarted","Data":"6518915d0d38e461a3b4f02865cc5ce1c217cde804c2aaef7bba8dc701b1c734"} Jan 11 18:15:02 crc kubenswrapper[4837]: I0111 18:15:02.908578 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.087249 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82dc5a36-2841-458f-bfe2-05985b81ee4d-secret-volume\") pod \"82dc5a36-2841-458f-bfe2-05985b81ee4d\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.087332 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktcd5\" (UniqueName: \"kubernetes.io/projected/82dc5a36-2841-458f-bfe2-05985b81ee4d-kube-api-access-ktcd5\") pod \"82dc5a36-2841-458f-bfe2-05985b81ee4d\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.087413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82dc5a36-2841-458f-bfe2-05985b81ee4d-config-volume\") pod 
\"82dc5a36-2841-458f-bfe2-05985b81ee4d\" (UID: \"82dc5a36-2841-458f-bfe2-05985b81ee4d\") " Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.088430 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82dc5a36-2841-458f-bfe2-05985b81ee4d-config-volume" (OuterVolumeSpecName: "config-volume") pod "82dc5a36-2841-458f-bfe2-05985b81ee4d" (UID: "82dc5a36-2841-458f-bfe2-05985b81ee4d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.093147 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82dc5a36-2841-458f-bfe2-05985b81ee4d-kube-api-access-ktcd5" (OuterVolumeSpecName: "kube-api-access-ktcd5") pod "82dc5a36-2841-458f-bfe2-05985b81ee4d" (UID: "82dc5a36-2841-458f-bfe2-05985b81ee4d"). InnerVolumeSpecName "kube-api-access-ktcd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.094501 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82dc5a36-2841-458f-bfe2-05985b81ee4d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82dc5a36-2841-458f-bfe2-05985b81ee4d" (UID: "82dc5a36-2841-458f-bfe2-05985b81ee4d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.189570 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82dc5a36-2841-458f-bfe2-05985b81ee4d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.189614 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktcd5\" (UniqueName: \"kubernetes.io/projected/82dc5a36-2841-458f-bfe2-05985b81ee4d-kube-api-access-ktcd5\") on node \"crc\" DevicePath \"\"" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.189627 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82dc5a36-2841-458f-bfe2-05985b81ee4d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.573807 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" event={"ID":"82dc5a36-2841-458f-bfe2-05985b81ee4d","Type":"ContainerDied","Data":"6518915d0d38e461a3b4f02865cc5ce1c217cde804c2aaef7bba8dc701b1c734"} Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.573890 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6518915d0d38e461a3b4f02865cc5ce1c217cde804c2aaef7bba8dc701b1c734" Jan 11 18:15:03 crc kubenswrapper[4837]: I0111 18:15:03.573910 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469255-m7vmd" Jan 11 18:15:04 crc kubenswrapper[4837]: I0111 18:15:03.999732 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r"] Jan 11 18:15:04 crc kubenswrapper[4837]: I0111 18:15:04.007723 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469210-m2c5r"] Jan 11 18:15:04 crc kubenswrapper[4837]: I0111 18:15:04.379053 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90317184-ec5a-4dc2-a9f8-6075c0d78aa1" path="/var/lib/kubelet/pods/90317184-ec5a-4dc2-a9f8-6075c0d78aa1/volumes" Jan 11 18:15:37 crc kubenswrapper[4837]: I0111 18:15:37.624377 4837 scope.go:117] "RemoveContainer" containerID="68d6e80bddec6bf073fe45ce7b2b3e74314236ca041b7b15504ef166b0de6576" Jan 11 18:16:39 crc kubenswrapper[4837]: I0111 18:16:39.444320 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:16:39 crc kubenswrapper[4837]: I0111 18:16:39.444951 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:17:09 crc kubenswrapper[4837]: I0111 18:17:09.444031 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Jan 11 18:17:09 crc kubenswrapper[4837]: I0111 18:17:09.444548 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.628991 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94mhl"] Jan 11 18:17:28 crc kubenswrapper[4837]: E0111 18:17:28.629940 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82dc5a36-2841-458f-bfe2-05985b81ee4d" containerName="collect-profiles" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.629952 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dc5a36-2841-458f-bfe2-05985b81ee4d" containerName="collect-profiles" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.630134 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="82dc5a36-2841-458f-bfe2-05985b81ee4d" containerName="collect-profiles" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.631508 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.652740 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94mhl"] Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.683419 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmgd9\" (UniqueName: \"kubernetes.io/projected/ed448dd8-9328-467b-bb4e-a0e661ff5073-kube-api-access-vmgd9\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.683732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-utilities\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.683861 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-catalog-content\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.786303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-utilities\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.786433 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-catalog-content\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.786507 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmgd9\" (UniqueName: \"kubernetes.io/projected/ed448dd8-9328-467b-bb4e-a0e661ff5073-kube-api-access-vmgd9\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.786912 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-utilities\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.787030 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-catalog-content\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.807017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmgd9\" (UniqueName: \"kubernetes.io/projected/ed448dd8-9328-467b-bb4e-a0e661ff5073-kube-api-access-vmgd9\") pod \"certified-operators-94mhl\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:28 crc kubenswrapper[4837]: I0111 18:17:28.964125 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:29 crc kubenswrapper[4837]: I0111 18:17:29.491259 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94mhl"] Jan 11 18:17:30 crc kubenswrapper[4837]: I0111 18:17:30.023089 4837 generic.go:334] "Generic (PLEG): container finished" podID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerID="887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911" exitCode=0 Jan 11 18:17:30 crc kubenswrapper[4837]: I0111 18:17:30.023242 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mhl" event={"ID":"ed448dd8-9328-467b-bb4e-a0e661ff5073","Type":"ContainerDied","Data":"887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911"} Jan 11 18:17:30 crc kubenswrapper[4837]: I0111 18:17:30.023333 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mhl" event={"ID":"ed448dd8-9328-467b-bb4e-a0e661ff5073","Type":"ContainerStarted","Data":"ec6e964b32c5ad2a36b4e4bf91a9cbfb2c2fbd989ea5eb1957e74dc4aed9902d"} Jan 11 18:17:30 crc kubenswrapper[4837]: I0111 18:17:30.025214 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 18:17:31 crc kubenswrapper[4837]: I0111 18:17:31.036806 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mhl" event={"ID":"ed448dd8-9328-467b-bb4e-a0e661ff5073","Type":"ContainerStarted","Data":"f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342"} Jan 11 18:17:32 crc kubenswrapper[4837]: I0111 18:17:32.047468 4837 generic.go:334] "Generic (PLEG): container finished" podID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerID="f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342" exitCode=0 Jan 11 18:17:32 crc kubenswrapper[4837]: I0111 18:17:32.047532 4837 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-94mhl" event={"ID":"ed448dd8-9328-467b-bb4e-a0e661ff5073","Type":"ContainerDied","Data":"f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342"} Jan 11 18:17:33 crc kubenswrapper[4837]: I0111 18:17:33.062346 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mhl" event={"ID":"ed448dd8-9328-467b-bb4e-a0e661ff5073","Type":"ContainerStarted","Data":"59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27"} Jan 11 18:17:36 crc kubenswrapper[4837]: E0111 18:17:36.334119 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38ba1b37_c033_461b_bf07_7aecd5d1e5a1.slice/crio-conmon-3d9167f4dbb4cc706f92e015b354d06066bdd55da31150c3f44887d682592abc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38ba1b37_c033_461b_bf07_7aecd5d1e5a1.slice/crio-3d9167f4dbb4cc706f92e015b354d06066bdd55da31150c3f44887d682592abc.scope\": RecentStats: unable to find data in memory cache]" Jan 11 18:17:37 crc kubenswrapper[4837]: I0111 18:17:37.108827 4837 generic.go:334] "Generic (PLEG): container finished" podID="38ba1b37-c033-461b-bf07-7aecd5d1e5a1" containerID="3d9167f4dbb4cc706f92e015b354d06066bdd55da31150c3f44887d682592abc" exitCode=0 Jan 11 18:17:37 crc kubenswrapper[4837]: I0111 18:17:37.108894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" event={"ID":"38ba1b37-c033-461b-bf07-7aecd5d1e5a1","Type":"ContainerDied","Data":"3d9167f4dbb4cc706f92e015b354d06066bdd55da31150c3f44887d682592abc"} Jan 11 18:17:37 crc kubenswrapper[4837]: I0111 18:17:37.140644 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94mhl" 
podStartSLOduration=6.70319074 podStartE2EDuration="9.140621774s" podCreationTimestamp="2026-01-11 18:17:28 +0000 UTC" firstStartedPulling="2026-01-11 18:17:30.0249574 +0000 UTC m=+2824.203150106" lastFinishedPulling="2026-01-11 18:17:32.462388394 +0000 UTC m=+2826.640581140" observedRunningTime="2026-01-11 18:17:33.089644998 +0000 UTC m=+2827.267837724" watchObservedRunningTime="2026-01-11 18:17:37.140621774 +0000 UTC m=+2831.318814480" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.637087 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.824655 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-telemetry-combined-ca-bundle\") pod \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.824748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-inventory\") pod \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.824851 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pkzr\" (UniqueName: \"kubernetes.io/projected/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-kube-api-access-4pkzr\") pod \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.824925 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-0\") pod \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.824948 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-1\") pod \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.825000 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ssh-key-openstack-edpm-ipam\") pod \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.825030 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-2\") pod \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\" (UID: \"38ba1b37-c033-461b-bf07-7aecd5d1e5a1\") " Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.839010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "38ba1b37-c033-461b-bf07-7aecd5d1e5a1" (UID: "38ba1b37-c033-461b-bf07-7aecd5d1e5a1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.839020 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-kube-api-access-4pkzr" (OuterVolumeSpecName: "kube-api-access-4pkzr") pod "38ba1b37-c033-461b-bf07-7aecd5d1e5a1" (UID: "38ba1b37-c033-461b-bf07-7aecd5d1e5a1"). InnerVolumeSpecName "kube-api-access-4pkzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.855822 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "38ba1b37-c033-461b-bf07-7aecd5d1e5a1" (UID: "38ba1b37-c033-461b-bf07-7aecd5d1e5a1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.856787 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-inventory" (OuterVolumeSpecName: "inventory") pod "38ba1b37-c033-461b-bf07-7aecd5d1e5a1" (UID: "38ba1b37-c033-461b-bf07-7aecd5d1e5a1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.862963 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "38ba1b37-c033-461b-bf07-7aecd5d1e5a1" (UID: "38ba1b37-c033-461b-bf07-7aecd5d1e5a1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.865488 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "38ba1b37-c033-461b-bf07-7aecd5d1e5a1" (UID: "38ba1b37-c033-461b-bf07-7aecd5d1e5a1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.877981 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "38ba1b37-c033-461b-bf07-7aecd5d1e5a1" (UID: "38ba1b37-c033-461b-bf07-7aecd5d1e5a1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.928260 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.928429 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-inventory\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.928446 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pkzr\" (UniqueName: \"kubernetes.io/projected/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-kube-api-access-4pkzr\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.928554 4837 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.928571 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.928585 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.928597 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38ba1b37-c033-461b-bf07-7aecd5d1e5a1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.965084 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:38 crc kubenswrapper[4837]: I0111 18:17:38.965933 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.029001 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.130888 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.130931 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fg96b" event={"ID":"38ba1b37-c033-461b-bf07-7aecd5d1e5a1","Type":"ContainerDied","Data":"9e7f26165fcda891d91a7a5a89605df117af27bf9fcc30b9fa7078393a464c2f"} Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.131000 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e7f26165fcda891d91a7a5a89605df117af27bf9fcc30b9fa7078393a464c2f" Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.204421 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.276387 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94mhl"] Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.444098 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.444196 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.444283 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:17:39 crc 
kubenswrapper[4837]: I0111 18:17:39.445065 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1a9906e1f83fc15699d820fe6fec4c1a5b7031cfbe2d186a79b8184951f2fcd"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:17:39 crc kubenswrapper[4837]: I0111 18:17:39.445143 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://f1a9906e1f83fc15699d820fe6fec4c1a5b7031cfbe2d186a79b8184951f2fcd" gracePeriod=600 Jan 11 18:17:40 crc kubenswrapper[4837]: I0111 18:17:40.141641 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="f1a9906e1f83fc15699d820fe6fec4c1a5b7031cfbe2d186a79b8184951f2fcd" exitCode=0 Jan 11 18:17:40 crc kubenswrapper[4837]: I0111 18:17:40.141723 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"f1a9906e1f83fc15699d820fe6fec4c1a5b7031cfbe2d186a79b8184951f2fcd"} Jan 11 18:17:40 crc kubenswrapper[4837]: I0111 18:17:40.142507 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf"} Jan 11 18:17:40 crc kubenswrapper[4837]: I0111 18:17:40.142538 4837 scope.go:117] "RemoveContainer" containerID="2154fb56352ed810c3f2e54326f8ee766b9503eb23a751202bc7e683091dfb89" Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.152538 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94mhl" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="registry-server" containerID="cri-o://59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27" gracePeriod=2 Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.636981 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.788838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-utilities\") pod \"ed448dd8-9328-467b-bb4e-a0e661ff5073\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.789224 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-catalog-content\") pod \"ed448dd8-9328-467b-bb4e-a0e661ff5073\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.789330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmgd9\" (UniqueName: \"kubernetes.io/projected/ed448dd8-9328-467b-bb4e-a0e661ff5073-kube-api-access-vmgd9\") pod \"ed448dd8-9328-467b-bb4e-a0e661ff5073\" (UID: \"ed448dd8-9328-467b-bb4e-a0e661ff5073\") " Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.789845 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-utilities" (OuterVolumeSpecName: "utilities") pod "ed448dd8-9328-467b-bb4e-a0e661ff5073" (UID: "ed448dd8-9328-467b-bb4e-a0e661ff5073"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.791828 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.796851 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed448dd8-9328-467b-bb4e-a0e661ff5073-kube-api-access-vmgd9" (OuterVolumeSpecName: "kube-api-access-vmgd9") pod "ed448dd8-9328-467b-bb4e-a0e661ff5073" (UID: "ed448dd8-9328-467b-bb4e-a0e661ff5073"). InnerVolumeSpecName "kube-api-access-vmgd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.834764 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed448dd8-9328-467b-bb4e-a0e661ff5073" (UID: "ed448dd8-9328-467b-bb4e-a0e661ff5073"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.893610 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed448dd8-9328-467b-bb4e-a0e661ff5073-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:41 crc kubenswrapper[4837]: I0111 18:17:41.893646 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmgd9\" (UniqueName: \"kubernetes.io/projected/ed448dd8-9328-467b-bb4e-a0e661ff5073-kube-api-access-vmgd9\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.162157 4837 generic.go:334] "Generic (PLEG): container finished" podID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerID="59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27" exitCode=0 Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.162245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mhl" event={"ID":"ed448dd8-9328-467b-bb4e-a0e661ff5073","Type":"ContainerDied","Data":"59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27"} Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.162265 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94mhl" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.162294 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mhl" event={"ID":"ed448dd8-9328-467b-bb4e-a0e661ff5073","Type":"ContainerDied","Data":"ec6e964b32c5ad2a36b4e4bf91a9cbfb2c2fbd989ea5eb1957e74dc4aed9902d"} Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.162317 4837 scope.go:117] "RemoveContainer" containerID="59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.199716 4837 scope.go:117] "RemoveContainer" containerID="f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.208650 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94mhl"] Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.217769 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94mhl"] Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.222790 4837 scope.go:117] "RemoveContainer" containerID="887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.260069 4837 scope.go:117] "RemoveContainer" containerID="59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27" Jan 11 18:17:42 crc kubenswrapper[4837]: E0111 18:17:42.260635 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27\": container with ID starting with 59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27 not found: ID does not exist" containerID="59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.260714 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27"} err="failed to get container status \"59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27\": rpc error: code = NotFound desc = could not find container \"59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27\": container with ID starting with 59d1a81c95a68c5fe41158cfa451f0fc3c6d4b12bd9afc5613344617b7b50d27 not found: ID does not exist" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.260760 4837 scope.go:117] "RemoveContainer" containerID="f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342" Jan 11 18:17:42 crc kubenswrapper[4837]: E0111 18:17:42.261163 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342\": container with ID starting with f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342 not found: ID does not exist" containerID="f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.261221 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342"} err="failed to get container status \"f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342\": rpc error: code = NotFound desc = could not find container \"f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342\": container with ID starting with f8b46e944b1b590b5ed5ba043765e49e9d04df6cbd1c5c26585e46910c2b8342 not found: ID does not exist" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.261267 4837 scope.go:117] "RemoveContainer" containerID="887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911" Jan 11 18:17:42 crc kubenswrapper[4837]: E0111 
18:17:42.261616 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911\": container with ID starting with 887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911 not found: ID does not exist" containerID="887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.261663 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911"} err="failed to get container status \"887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911\": rpc error: code = NotFound desc = could not find container \"887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911\": container with ID starting with 887993d9d2372b1721447ae1a29d4a1a58140a0a36d0f01d51a7c56baa258911 not found: ID does not exist" Jan 11 18:17:42 crc kubenswrapper[4837]: I0111 18:17:42.375562 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" path="/var/lib/kubelet/pods/ed448dd8-9328-467b-bb4e-a0e661ff5073/volumes" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.692723 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mgv5j"] Jan 11 18:17:44 crc kubenswrapper[4837]: E0111 18:17:44.694949 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="extract-utilities" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.694982 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="extract-utilities" Jan 11 18:17:44 crc kubenswrapper[4837]: E0111 18:17:44.695007 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38ba1b37-c033-461b-bf07-7aecd5d1e5a1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.695032 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ba1b37-c033-461b-bf07-7aecd5d1e5a1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 11 18:17:44 crc kubenswrapper[4837]: E0111 18:17:44.695056 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="extract-content" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.695066 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="extract-content" Jan 11 18:17:44 crc kubenswrapper[4837]: E0111 18:17:44.695083 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="registry-server" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.695089 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="registry-server" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.695270 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ba1b37-c033-461b-bf07-7aecd5d1e5a1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.695300 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed448dd8-9328-467b-bb4e-a0e661ff5073" containerName="registry-server" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.696869 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.708972 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgv5j"] Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.854970 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-utilities\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.855113 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznlt\" (UniqueName: \"kubernetes.io/projected/6b288523-acdf-4742-81a2-aabef3dff7ce-kube-api-access-vznlt\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.855270 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-catalog-content\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.956702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-catalog-content\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.956820 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-utilities\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.956906 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vznlt\" (UniqueName: \"kubernetes.io/projected/6b288523-acdf-4742-81a2-aabef3dff7ce-kube-api-access-vznlt\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.957269 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-catalog-content\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.957354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-utilities\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:44 crc kubenswrapper[4837]: I0111 18:17:44.989600 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznlt\" (UniqueName: \"kubernetes.io/projected/6b288523-acdf-4742-81a2-aabef3dff7ce-kube-api-access-vznlt\") pod \"redhat-marketplace-mgv5j\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:45 crc kubenswrapper[4837]: I0111 18:17:45.022909 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:45 crc kubenswrapper[4837]: I0111 18:17:45.529138 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgv5j"] Jan 11 18:17:46 crc kubenswrapper[4837]: I0111 18:17:46.217591 4837 generic.go:334] "Generic (PLEG): container finished" podID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerID="85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83" exitCode=0 Jan 11 18:17:46 crc kubenswrapper[4837]: I0111 18:17:46.219244 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgv5j" event={"ID":"6b288523-acdf-4742-81a2-aabef3dff7ce","Type":"ContainerDied","Data":"85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83"} Jan 11 18:17:46 crc kubenswrapper[4837]: I0111 18:17:46.219289 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgv5j" event={"ID":"6b288523-acdf-4742-81a2-aabef3dff7ce","Type":"ContainerStarted","Data":"db4a9a4c59f987c1653d35ba26f8eedaec421697125d9a7c8e7c0586a804ba43"} Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.243441 4837 generic.go:334] "Generic (PLEG): container finished" podID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerID="eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2" exitCode=0 Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.245100 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgv5j" event={"ID":"6b288523-acdf-4742-81a2-aabef3dff7ce","Type":"ContainerDied","Data":"eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2"} Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.288731 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-krrnj"] Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.291775 4837 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.305751 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krrnj"] Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.352339 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4zv2\" (UniqueName: \"kubernetes.io/projected/7e725d66-1f9e-4ebd-9f11-66f983a00222-kube-api-access-q4zv2\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.352456 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-catalog-content\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.352494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-utilities\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.455119 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4zv2\" (UniqueName: \"kubernetes.io/projected/7e725d66-1f9e-4ebd-9f11-66f983a00222-kube-api-access-q4zv2\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.455360 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-catalog-content\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.455440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-utilities\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.455928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-utilities\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.455956 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-catalog-content\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.476521 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4zv2\" (UniqueName: \"kubernetes.io/projected/7e725d66-1f9e-4ebd-9f11-66f983a00222-kube-api-access-q4zv2\") pod \"community-operators-krrnj\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:48 crc kubenswrapper[4837]: I0111 18:17:48.620893 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:49 crc kubenswrapper[4837]: I0111 18:17:49.181694 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krrnj"] Jan 11 18:17:49 crc kubenswrapper[4837]: I0111 18:17:49.256882 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgv5j" event={"ID":"6b288523-acdf-4742-81a2-aabef3dff7ce","Type":"ContainerStarted","Data":"064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf"} Jan 11 18:17:49 crc kubenswrapper[4837]: I0111 18:17:49.258556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krrnj" event={"ID":"7e725d66-1f9e-4ebd-9f11-66f983a00222","Type":"ContainerStarted","Data":"6580fa7c775767716435bc58d93adf15fa7591c02fabd0a9afd2a12230305ed2"} Jan 11 18:17:49 crc kubenswrapper[4837]: I0111 18:17:49.285509 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mgv5j" podStartSLOduration=2.702518698 podStartE2EDuration="5.285491625s" podCreationTimestamp="2026-01-11 18:17:44 +0000 UTC" firstStartedPulling="2026-01-11 18:17:46.220830148 +0000 UTC m=+2840.399022854" lastFinishedPulling="2026-01-11 18:17:48.803803085 +0000 UTC m=+2842.981995781" observedRunningTime="2026-01-11 18:17:49.283987914 +0000 UTC m=+2843.462180620" watchObservedRunningTime="2026-01-11 18:17:49.285491625 +0000 UTC m=+2843.463684321" Jan 11 18:17:50 crc kubenswrapper[4837]: I0111 18:17:50.269576 4837 generic.go:334] "Generic (PLEG): container finished" podID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerID="cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533" exitCode=0 Jan 11 18:17:50 crc kubenswrapper[4837]: I0111 18:17:50.269669 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krrnj" 
event={"ID":"7e725d66-1f9e-4ebd-9f11-66f983a00222","Type":"ContainerDied","Data":"cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533"} Jan 11 18:17:51 crc kubenswrapper[4837]: I0111 18:17:51.281429 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krrnj" event={"ID":"7e725d66-1f9e-4ebd-9f11-66f983a00222","Type":"ContainerStarted","Data":"a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc"} Jan 11 18:17:52 crc kubenswrapper[4837]: I0111 18:17:52.292470 4837 generic.go:334] "Generic (PLEG): container finished" podID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerID="a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc" exitCode=0 Jan 11 18:17:52 crc kubenswrapper[4837]: I0111 18:17:52.292528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krrnj" event={"ID":"7e725d66-1f9e-4ebd-9f11-66f983a00222","Type":"ContainerDied","Data":"a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc"} Jan 11 18:17:53 crc kubenswrapper[4837]: I0111 18:17:53.309212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krrnj" event={"ID":"7e725d66-1f9e-4ebd-9f11-66f983a00222","Type":"ContainerStarted","Data":"6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632"} Jan 11 18:17:53 crc kubenswrapper[4837]: I0111 18:17:53.335946 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-krrnj" podStartSLOduration=2.727733402 podStartE2EDuration="5.335924337s" podCreationTimestamp="2026-01-11 18:17:48 +0000 UTC" firstStartedPulling="2026-01-11 18:17:50.271844665 +0000 UTC m=+2844.450037371" lastFinishedPulling="2026-01-11 18:17:52.8800356 +0000 UTC m=+2847.058228306" observedRunningTime="2026-01-11 18:17:53.333538753 +0000 UTC m=+2847.511731469" watchObservedRunningTime="2026-01-11 18:17:53.335924337 +0000 UTC 
m=+2847.514117053" Jan 11 18:17:55 crc kubenswrapper[4837]: I0111 18:17:55.023113 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:55 crc kubenswrapper[4837]: I0111 18:17:55.025054 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:55 crc kubenswrapper[4837]: I0111 18:17:55.105272 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:55 crc kubenswrapper[4837]: I0111 18:17:55.381764 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:56 crc kubenswrapper[4837]: I0111 18:17:56.076327 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgv5j"] Jan 11 18:17:57 crc kubenswrapper[4837]: I0111 18:17:57.351721 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mgv5j" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="registry-server" containerID="cri-o://064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf" gracePeriod=2 Jan 11 18:17:57 crc kubenswrapper[4837]: I0111 18:17:57.893488 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:57 crc kubenswrapper[4837]: I0111 18:17:57.985341 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vznlt\" (UniqueName: \"kubernetes.io/projected/6b288523-acdf-4742-81a2-aabef3dff7ce-kube-api-access-vznlt\") pod \"6b288523-acdf-4742-81a2-aabef3dff7ce\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " Jan 11 18:17:57 crc kubenswrapper[4837]: I0111 18:17:57.985576 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-catalog-content\") pod \"6b288523-acdf-4742-81a2-aabef3dff7ce\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " Jan 11 18:17:57 crc kubenswrapper[4837]: I0111 18:17:57.985628 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-utilities\") pod \"6b288523-acdf-4742-81a2-aabef3dff7ce\" (UID: \"6b288523-acdf-4742-81a2-aabef3dff7ce\") " Jan 11 18:17:57 crc kubenswrapper[4837]: I0111 18:17:57.986226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-utilities" (OuterVolumeSpecName: "utilities") pod "6b288523-acdf-4742-81a2-aabef3dff7ce" (UID: "6b288523-acdf-4742-81a2-aabef3dff7ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:57.993952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b288523-acdf-4742-81a2-aabef3dff7ce-kube-api-access-vznlt" (OuterVolumeSpecName: "kube-api-access-vznlt") pod "6b288523-acdf-4742-81a2-aabef3dff7ce" (UID: "6b288523-acdf-4742-81a2-aabef3dff7ce"). InnerVolumeSpecName "kube-api-access-vznlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.044955 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b288523-acdf-4742-81a2-aabef3dff7ce" (UID: "6b288523-acdf-4742-81a2-aabef3dff7ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.088132 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.088176 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vznlt\" (UniqueName: \"kubernetes.io/projected/6b288523-acdf-4742-81a2-aabef3dff7ce-kube-api-access-vznlt\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.088190 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b288523-acdf-4742-81a2-aabef3dff7ce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.378888 4837 generic.go:334] "Generic (PLEG): container finished" podID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerID="064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf" exitCode=0 Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.378998 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgv5j" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.385204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgv5j" event={"ID":"6b288523-acdf-4742-81a2-aabef3dff7ce","Type":"ContainerDied","Data":"064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf"} Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.385256 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgv5j" event={"ID":"6b288523-acdf-4742-81a2-aabef3dff7ce","Type":"ContainerDied","Data":"db4a9a4c59f987c1653d35ba26f8eedaec421697125d9a7c8e7c0586a804ba43"} Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.385280 4837 scope.go:117] "RemoveContainer" containerID="064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.429487 4837 scope.go:117] "RemoveContainer" containerID="eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.432731 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgv5j"] Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.447186 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgv5j"] Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.455573 4837 scope.go:117] "RemoveContainer" containerID="85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.506223 4837 scope.go:117] "RemoveContainer" containerID="064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf" Jan 11 18:17:58 crc kubenswrapper[4837]: E0111 18:17:58.506756 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf\": container with ID starting with 064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf not found: ID does not exist" containerID="064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.506792 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf"} err="failed to get container status \"064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf\": rpc error: code = NotFound desc = could not find container \"064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf\": container with ID starting with 064b612aa59f809174b1484f0a3d014a5d276676900c95d570d9afc417d0b2bf not found: ID does not exist" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.506823 4837 scope.go:117] "RemoveContainer" containerID="eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2" Jan 11 18:17:58 crc kubenswrapper[4837]: E0111 18:17:58.507114 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2\": container with ID starting with eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2 not found: ID does not exist" containerID="eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.507192 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2"} err="failed to get container status \"eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2\": rpc error: code = NotFound desc = could not find container \"eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2\": container with ID 
starting with eda3a3b90e5032a11d3a705f81f86a53f68ac7527edab35e7b3dac0d76344ce2 not found: ID does not exist" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.507258 4837 scope.go:117] "RemoveContainer" containerID="85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83" Jan 11 18:17:58 crc kubenswrapper[4837]: E0111 18:17:58.507657 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83\": container with ID starting with 85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83 not found: ID does not exist" containerID="85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.507706 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83"} err="failed to get container status \"85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83\": rpc error: code = NotFound desc = could not find container \"85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83\": container with ID starting with 85e6f8b7f9b35a183629ef2051e9a71d34fd7eaac5e0c8cd2d1bddb0f1ccaa83 not found: ID does not exist" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.622255 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.622314 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:58 crc kubenswrapper[4837]: I0111 18:17:58.681092 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:17:59 crc kubenswrapper[4837]: I0111 18:17:59.451528 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:18:00 crc kubenswrapper[4837]: I0111 18:18:00.375821 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" path="/var/lib/kubelet/pods/6b288523-acdf-4742-81a2-aabef3dff7ce/volumes" Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.278569 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krrnj"] Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.418554 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-krrnj" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="registry-server" containerID="cri-o://6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632" gracePeriod=2 Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.950175 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.989185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-utilities\") pod \"7e725d66-1f9e-4ebd-9f11-66f983a00222\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.989309 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4zv2\" (UniqueName: \"kubernetes.io/projected/7e725d66-1f9e-4ebd-9f11-66f983a00222-kube-api-access-q4zv2\") pod \"7e725d66-1f9e-4ebd-9f11-66f983a00222\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.989350 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-catalog-content\") pod \"7e725d66-1f9e-4ebd-9f11-66f983a00222\" (UID: \"7e725d66-1f9e-4ebd-9f11-66f983a00222\") " Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.990209 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-utilities" (OuterVolumeSpecName: "utilities") pod "7e725d66-1f9e-4ebd-9f11-66f983a00222" (UID: "7e725d66-1f9e-4ebd-9f11-66f983a00222"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:18:02 crc kubenswrapper[4837]: I0111 18:18:02.997012 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e725d66-1f9e-4ebd-9f11-66f983a00222-kube-api-access-q4zv2" (OuterVolumeSpecName: "kube-api-access-q4zv2") pod "7e725d66-1f9e-4ebd-9f11-66f983a00222" (UID: "7e725d66-1f9e-4ebd-9f11-66f983a00222"). InnerVolumeSpecName "kube-api-access-q4zv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.049585 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e725d66-1f9e-4ebd-9f11-66f983a00222" (UID: "7e725d66-1f9e-4ebd-9f11-66f983a00222"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.092486 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4zv2\" (UniqueName: \"kubernetes.io/projected/7e725d66-1f9e-4ebd-9f11-66f983a00222-kube-api-access-q4zv2\") on node \"crc\" DevicePath \"\"" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.092527 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.092540 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e725d66-1f9e-4ebd-9f11-66f983a00222-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.429837 4837 generic.go:334] "Generic (PLEG): container finished" podID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerID="6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632" exitCode=0 Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.429887 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krrnj" event={"ID":"7e725d66-1f9e-4ebd-9f11-66f983a00222","Type":"ContainerDied","Data":"6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632"} Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.429984 4837 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-krrnj" event={"ID":"7e725d66-1f9e-4ebd-9f11-66f983a00222","Type":"ContainerDied","Data":"6580fa7c775767716435bc58d93adf15fa7591c02fabd0a9afd2a12230305ed2"} Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.430010 4837 scope.go:117] "RemoveContainer" containerID="6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.429932 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krrnj" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.467734 4837 scope.go:117] "RemoveContainer" containerID="a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.486575 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krrnj"] Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.497926 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-krrnj"] Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.513614 4837 scope.go:117] "RemoveContainer" containerID="cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.565214 4837 scope.go:117] "RemoveContainer" containerID="6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632" Jan 11 18:18:03 crc kubenswrapper[4837]: E0111 18:18:03.565896 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632\": container with ID starting with 6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632 not found: ID does not exist" containerID="6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 
18:18:03.565947 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632"} err="failed to get container status \"6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632\": rpc error: code = NotFound desc = could not find container \"6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632\": container with ID starting with 6d4150a166416e154f472c3a83edc7e45d7e5a041fb46292c5986bfe78dab632 not found: ID does not exist" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.565979 4837 scope.go:117] "RemoveContainer" containerID="a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc" Jan 11 18:18:03 crc kubenswrapper[4837]: E0111 18:18:03.567307 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc\": container with ID starting with a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc not found: ID does not exist" containerID="a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.567346 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc"} err="failed to get container status \"a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc\": rpc error: code = NotFound desc = could not find container \"a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc\": container with ID starting with a25e3022b4671bb5f83e953f55021cd96cc58ab9651d48970db15496b2435bdc not found: ID does not exist" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.567373 4837 scope.go:117] "RemoveContainer" containerID="cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533" Jan 11 18:18:03 crc 
kubenswrapper[4837]: E0111 18:18:03.567892 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533\": container with ID starting with cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533 not found: ID does not exist" containerID="cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533" Jan 11 18:18:03 crc kubenswrapper[4837]: I0111 18:18:03.567945 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533"} err="failed to get container status \"cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533\": rpc error: code = NotFound desc = could not find container \"cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533\": container with ID starting with cc4ed19bd039ca77837737593404c4bc452dd6ec058ecae7007ed3ff76c54533 not found: ID does not exist" Jan 11 18:18:04 crc kubenswrapper[4837]: I0111 18:18:04.376810 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" path="/var/lib/kubelet/pods/7e725d66-1f9e-4ebd-9f11-66f983a00222/volumes" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.671371 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 11 18:18:20 crc kubenswrapper[4837]: E0111 18:18:20.673024 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="registry-server" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673063 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="registry-server" Jan 11 18:18:20 crc kubenswrapper[4837]: E0111 18:18:20.673092 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="extract-utilities" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673104 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="extract-utilities" Jan 11 18:18:20 crc kubenswrapper[4837]: E0111 18:18:20.673130 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="extract-content" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673143 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="extract-content" Jan 11 18:18:20 crc kubenswrapper[4837]: E0111 18:18:20.673170 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="extract-content" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673180 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="extract-content" Jan 11 18:18:20 crc kubenswrapper[4837]: E0111 18:18:20.673209 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="extract-utilities" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673222 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="extract-utilities" Jan 11 18:18:20 crc kubenswrapper[4837]: E0111 18:18:20.673238 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="registry-server" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673248 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="registry-server" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673567 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6b288523-acdf-4742-81a2-aabef3dff7ce" containerName="registry-server" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.673601 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e725d66-1f9e-4ebd-9f11-66f983a00222" containerName="registry-server" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.674585 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.678365 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xcp84" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.678470 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.678489 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.682520 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.685069 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.690392 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-config-data\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.690797 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.690835 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792560 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792651 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792749 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792792 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792817 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qzjl\" (UniqueName: \"kubernetes.io/projected/803594a1-a21b-4a8d-bf22-a2f1786b3822-kube-api-access-7qzjl\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-config-data\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.792997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.795667 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.797919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-config-data\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.806847 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.894881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.895195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 
11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.895323 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.895484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.895614 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.895769 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qzjl\" (UniqueName: \"kubernetes.io/projected/803594a1-a21b-4a8d-bf22-a2f1786b3822-kube-api-access-7qzjl\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.895952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc 
kubenswrapper[4837]: I0111 18:18:20.896124 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.896161 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.899609 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.900722 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.916443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qzjl\" (UniqueName: \"kubernetes.io/projected/803594a1-a21b-4a8d-bf22-a2f1786b3822-kube-api-access-7qzjl\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:20 crc kubenswrapper[4837]: I0111 18:18:20.927579 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " pod="openstack/tempest-tests-tempest" Jan 11 18:18:21 crc kubenswrapper[4837]: I0111 18:18:21.000154 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 11 18:18:21 crc kubenswrapper[4837]: I0111 18:18:21.309141 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 11 18:18:21 crc kubenswrapper[4837]: W0111 18:18:21.318833 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod803594a1_a21b_4a8d_bf22_a2f1786b3822.slice/crio-c3c6499993014f892a165a8bf55478adcbd12e8578718b6fdb509cfb9fa85989 WatchSource:0}: Error finding container c3c6499993014f892a165a8bf55478adcbd12e8578718b6fdb509cfb9fa85989: Status 404 returned error can't find the container with id c3c6499993014f892a165a8bf55478adcbd12e8578718b6fdb509cfb9fa85989 Jan 11 18:18:21 crc kubenswrapper[4837]: I0111 18:18:21.618817 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"803594a1-a21b-4a8d-bf22-a2f1786b3822","Type":"ContainerStarted","Data":"c3c6499993014f892a165a8bf55478adcbd12e8578718b6fdb509cfb9fa85989"} Jan 11 18:18:53 crc kubenswrapper[4837]: E0111 18:18:53.918359 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 11 18:18:53 crc kubenswrapper[4837]: E0111 18:18:53.919353 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qzjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(803594a1-a21b-4a8d-bf22-a2f1786b3822): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 11 18:18:53 crc kubenswrapper[4837]: E0111 18:18:53.921357 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="803594a1-a21b-4a8d-bf22-a2f1786b3822" Jan 11 18:18:53 crc kubenswrapper[4837]: E0111 18:18:53.959982 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="803594a1-a21b-4a8d-bf22-a2f1786b3822" Jan 11 18:19:07 crc 
kubenswrapper[4837]: I0111 18:19:07.962011 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 11 18:19:09 crc kubenswrapper[4837]: I0111 18:19:09.115922 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"803594a1-a21b-4a8d-bf22-a2f1786b3822","Type":"ContainerStarted","Data":"ec0dbe4366ce26d5ceca364a3c5c7b477fd21e1cf35346bc091a56e28a5bc558"} Jan 11 18:19:09 crc kubenswrapper[4837]: I0111 18:19:09.153978 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.514843024 podStartE2EDuration="50.153953442s" podCreationTimestamp="2026-01-11 18:18:19 +0000 UTC" firstStartedPulling="2026-01-11 18:18:21.320468809 +0000 UTC m=+2875.498661515" lastFinishedPulling="2026-01-11 18:19:07.959579207 +0000 UTC m=+2922.137771933" observedRunningTime="2026-01-11 18:19:09.139612766 +0000 UTC m=+2923.317805482" watchObservedRunningTime="2026-01-11 18:19:09.153953442 +0000 UTC m=+2923.332146148" Jan 11 18:19:39 crc kubenswrapper[4837]: I0111 18:19:39.443685 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:19:39 crc kubenswrapper[4837]: I0111 18:19:39.444263 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:20:09 crc kubenswrapper[4837]: I0111 18:20:09.444015 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:20:09 crc kubenswrapper[4837]: I0111 18:20:09.444536 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:20:39 crc kubenswrapper[4837]: I0111 18:20:39.444342 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:20:39 crc kubenswrapper[4837]: I0111 18:20:39.444910 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:20:39 crc kubenswrapper[4837]: I0111 18:20:39.445001 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:20:39 crc kubenswrapper[4837]: I0111 18:20:39.445982 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:20:39 crc 
kubenswrapper[4837]: I0111 18:20:39.446042 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" gracePeriod=600 Jan 11 18:20:39 crc kubenswrapper[4837]: E0111 18:20:39.572044 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:20:40 crc kubenswrapper[4837]: I0111 18:20:40.113459 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" exitCode=0 Jan 11 18:20:40 crc kubenswrapper[4837]: I0111 18:20:40.113576 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf"} Jan 11 18:20:40 crc kubenswrapper[4837]: I0111 18:20:40.113859 4837 scope.go:117] "RemoveContainer" containerID="f1a9906e1f83fc15699d820fe6fec4c1a5b7031cfbe2d186a79b8184951f2fcd" Jan 11 18:20:40 crc kubenswrapper[4837]: I0111 18:20:40.115077 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:20:40 crc kubenswrapper[4837]: E0111 18:20:40.115538 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:20:51 crc kubenswrapper[4837]: I0111 18:20:51.364145 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:20:51 crc kubenswrapper[4837]: E0111 18:20:51.364930 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:21:06 crc kubenswrapper[4837]: I0111 18:21:06.374221 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:21:06 crc kubenswrapper[4837]: E0111 18:21:06.376464 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:21:22 crc kubenswrapper[4837]: I0111 18:21:22.363887 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:21:22 crc kubenswrapper[4837]: E0111 18:21:22.364930 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:21:35 crc kubenswrapper[4837]: I0111 18:21:35.363862 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:21:35 crc kubenswrapper[4837]: E0111 18:21:35.364622 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:21:49 crc kubenswrapper[4837]: I0111 18:21:49.365008 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:21:49 crc kubenswrapper[4837]: E0111 18:21:49.365858 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:22:00 crc kubenswrapper[4837]: I0111 18:22:00.364795 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:22:00 crc kubenswrapper[4837]: E0111 18:22:00.365623 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:22:15 crc kubenswrapper[4837]: I0111 18:22:15.365426 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:22:15 crc kubenswrapper[4837]: E0111 18:22:15.366587 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:22:29 crc kubenswrapper[4837]: I0111 18:22:29.364480 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:22:29 crc kubenswrapper[4837]: E0111 18:22:29.366409 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:22:44 crc kubenswrapper[4837]: I0111 18:22:44.364014 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:22:44 crc kubenswrapper[4837]: E0111 18:22:44.364845 4837 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.288449 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-676br"] Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.291783 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.307688 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-676br"] Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.360108 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-utilities\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.360204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrg8r\" (UniqueName: \"kubernetes.io/projected/f0d5171a-d111-4d41-bac3-ad548e082e18-kube-api-access-mrg8r\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.360415 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-catalog-content\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.462785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-catalog-content\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.462910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-utilities\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.462983 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrg8r\" (UniqueName: \"kubernetes.io/projected/f0d5171a-d111-4d41-bac3-ad548e082e18-kube-api-access-mrg8r\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.463927 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-catalog-content\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.464255 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-utilities\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.498946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrg8r\" (UniqueName: \"kubernetes.io/projected/f0d5171a-d111-4d41-bac3-ad548e082e18-kube-api-access-mrg8r\") pod \"redhat-operators-676br\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:48 crc kubenswrapper[4837]: I0111 18:22:48.617008 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:49 crc kubenswrapper[4837]: I0111 18:22:49.154629 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-676br"] Jan 11 18:22:49 crc kubenswrapper[4837]: I0111 18:22:49.682575 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerID="e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08" exitCode=0 Jan 11 18:22:49 crc kubenswrapper[4837]: I0111 18:22:49.682943 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-676br" event={"ID":"f0d5171a-d111-4d41-bac3-ad548e082e18","Type":"ContainerDied","Data":"e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08"} Jan 11 18:22:49 crc kubenswrapper[4837]: I0111 18:22:49.682980 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-676br" event={"ID":"f0d5171a-d111-4d41-bac3-ad548e082e18","Type":"ContainerStarted","Data":"05e7281ca6bab6eb15e60d5c0e4c441b31213bcbf1909412a73b7637942d107b"} Jan 11 18:22:49 crc kubenswrapper[4837]: I0111 18:22:49.686529 4837 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Jan 11 18:22:51 crc kubenswrapper[4837]: I0111 18:22:51.705356 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-676br" event={"ID":"f0d5171a-d111-4d41-bac3-ad548e082e18","Type":"ContainerStarted","Data":"d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995"} Jan 11 18:22:53 crc kubenswrapper[4837]: I0111 18:22:53.727253 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerID="d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995" exitCode=0 Jan 11 18:22:53 crc kubenswrapper[4837]: I0111 18:22:53.727320 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-676br" event={"ID":"f0d5171a-d111-4d41-bac3-ad548e082e18","Type":"ContainerDied","Data":"d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995"} Jan 11 18:22:54 crc kubenswrapper[4837]: I0111 18:22:54.740247 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-676br" event={"ID":"f0d5171a-d111-4d41-bac3-ad548e082e18","Type":"ContainerStarted","Data":"a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa"} Jan 11 18:22:54 crc kubenswrapper[4837]: I0111 18:22:54.765235 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-676br" podStartSLOduration=2.139087066 podStartE2EDuration="6.765220487s" podCreationTimestamp="2026-01-11 18:22:48 +0000 UTC" firstStartedPulling="2026-01-11 18:22:49.68630429 +0000 UTC m=+3143.864496986" lastFinishedPulling="2026-01-11 18:22:54.312437711 +0000 UTC m=+3148.490630407" observedRunningTime="2026-01-11 18:22:54.756751009 +0000 UTC m=+3148.934943715" watchObservedRunningTime="2026-01-11 18:22:54.765220487 +0000 UTC m=+3148.943413193" Jan 11 18:22:55 crc kubenswrapper[4837]: I0111 18:22:55.364075 4837 scope.go:117] "RemoveContainer" 
containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:22:55 crc kubenswrapper[4837]: E0111 18:22:55.364530 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:22:58 crc kubenswrapper[4837]: I0111 18:22:58.617453 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:58 crc kubenswrapper[4837]: I0111 18:22:58.618040 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:22:59 crc kubenswrapper[4837]: I0111 18:22:59.670172 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-676br" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="registry-server" probeResult="failure" output=< Jan 11 18:22:59 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 18:22:59 crc kubenswrapper[4837]: > Jan 11 18:23:08 crc kubenswrapper[4837]: I0111 18:23:08.669929 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:23:08 crc kubenswrapper[4837]: I0111 18:23:08.747146 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:23:08 crc kubenswrapper[4837]: I0111 18:23:08.911853 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-676br"] Jan 11 18:23:09 crc kubenswrapper[4837]: I0111 18:23:09.881431 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-676br" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="registry-server" containerID="cri-o://a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa" gracePeriod=2 Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.364839 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:23:10 crc kubenswrapper[4837]: E0111 18:23:10.365585 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.376751 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.449193 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-utilities\") pod \"f0d5171a-d111-4d41-bac3-ad548e082e18\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.449355 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-catalog-content\") pod \"f0d5171a-d111-4d41-bac3-ad548e082e18\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.449437 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrg8r\" (UniqueName: \"kubernetes.io/projected/f0d5171a-d111-4d41-bac3-ad548e082e18-kube-api-access-mrg8r\") pod \"f0d5171a-d111-4d41-bac3-ad548e082e18\" (UID: \"f0d5171a-d111-4d41-bac3-ad548e082e18\") " Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.450632 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-utilities" (OuterVolumeSpecName: "utilities") pod "f0d5171a-d111-4d41-bac3-ad548e082e18" (UID: "f0d5171a-d111-4d41-bac3-ad548e082e18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.457512 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d5171a-d111-4d41-bac3-ad548e082e18-kube-api-access-mrg8r" (OuterVolumeSpecName: "kube-api-access-mrg8r") pod "f0d5171a-d111-4d41-bac3-ad548e082e18" (UID: "f0d5171a-d111-4d41-bac3-ad548e082e18"). InnerVolumeSpecName "kube-api-access-mrg8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.552568 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.552604 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrg8r\" (UniqueName: \"kubernetes.io/projected/f0d5171a-d111-4d41-bac3-ad548e082e18-kube-api-access-mrg8r\") on node \"crc\" DevicePath \"\"" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.570814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0d5171a-d111-4d41-bac3-ad548e082e18" (UID: "f0d5171a-d111-4d41-bac3-ad548e082e18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.654439 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d5171a-d111-4d41-bac3-ad548e082e18-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.894129 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerID="a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa" exitCode=0 Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.894179 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-676br" event={"ID":"f0d5171a-d111-4d41-bac3-ad548e082e18","Type":"ContainerDied","Data":"a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa"} Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.894202 4837 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-676br" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.894227 4837 scope.go:117] "RemoveContainer" containerID="a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.894212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-676br" event={"ID":"f0d5171a-d111-4d41-bac3-ad548e082e18","Type":"ContainerDied","Data":"05e7281ca6bab6eb15e60d5c0e4c441b31213bcbf1909412a73b7637942d107b"} Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.938024 4837 scope.go:117] "RemoveContainer" containerID="d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995" Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.940944 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-676br"] Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.949821 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-676br"] Jan 11 18:23:10 crc kubenswrapper[4837]: I0111 18:23:10.978749 4837 scope.go:117] "RemoveContainer" containerID="e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08" Jan 11 18:23:11 crc kubenswrapper[4837]: I0111 18:23:11.011051 4837 scope.go:117] "RemoveContainer" containerID="a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa" Jan 11 18:23:11 crc kubenswrapper[4837]: E0111 18:23:11.011539 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa\": container with ID starting with a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa not found: ID does not exist" containerID="a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa" Jan 11 18:23:11 crc kubenswrapper[4837]: I0111 18:23:11.011595 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa"} err="failed to get container status \"a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa\": rpc error: code = NotFound desc = could not find container \"a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa\": container with ID starting with a8fd5a065e00dd5fb947ffdade607a11e42ad102d409aa158dfbca3b2e2271fa not found: ID does not exist" Jan 11 18:23:11 crc kubenswrapper[4837]: I0111 18:23:11.011625 4837 scope.go:117] "RemoveContainer" containerID="d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995" Jan 11 18:23:11 crc kubenswrapper[4837]: E0111 18:23:11.012128 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995\": container with ID starting with d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995 not found: ID does not exist" containerID="d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995" Jan 11 18:23:11 crc kubenswrapper[4837]: I0111 18:23:11.012181 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995"} err="failed to get container status \"d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995\": rpc error: code = NotFound desc = could not find container \"d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995\": container with ID starting with d9df20b42e401e30cee0bc16168b45b84b95a1f481c92c75010e19559e781995 not found: ID does not exist" Jan 11 18:23:11 crc kubenswrapper[4837]: I0111 18:23:11.012215 4837 scope.go:117] "RemoveContainer" containerID="e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08" Jan 11 18:23:11 crc kubenswrapper[4837]: E0111 
18:23:11.012557 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08\": container with ID starting with e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08 not found: ID does not exist" containerID="e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08" Jan 11 18:23:11 crc kubenswrapper[4837]: I0111 18:23:11.012607 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08"} err="failed to get container status \"e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08\": rpc error: code = NotFound desc = could not find container \"e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08\": container with ID starting with e1b7055b89caddbf33a4f5bbfeae44e8c89e5a2826b55edd2088edde65d77b08 not found: ID does not exist" Jan 11 18:23:12 crc kubenswrapper[4837]: I0111 18:23:12.381329 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" path="/var/lib/kubelet/pods/f0d5171a-d111-4d41-bac3-ad548e082e18/volumes" Jan 11 18:23:23 crc kubenswrapper[4837]: I0111 18:23:23.364954 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:23:23 crc kubenswrapper[4837]: E0111 18:23:23.365817 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:23:35 crc kubenswrapper[4837]: I0111 18:23:35.363833 
4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:23:35 crc kubenswrapper[4837]: E0111 18:23:35.364583 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:23:47 crc kubenswrapper[4837]: I0111 18:23:47.364812 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:23:47 crc kubenswrapper[4837]: E0111 18:23:47.367715 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:24:02 crc kubenswrapper[4837]: I0111 18:24:02.370473 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:24:02 crc kubenswrapper[4837]: E0111 18:24:02.371237 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:24:17 crc kubenswrapper[4837]: I0111 
18:24:17.364631 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:24:17 crc kubenswrapper[4837]: E0111 18:24:17.365458 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:24:29 crc kubenswrapper[4837]: I0111 18:24:29.363603 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:24:29 crc kubenswrapper[4837]: E0111 18:24:29.364261 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:24:41 crc kubenswrapper[4837]: I0111 18:24:41.364169 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:24:41 crc kubenswrapper[4837]: E0111 18:24:41.365144 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:24:55 crc 
kubenswrapper[4837]: I0111 18:24:55.363873 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:24:55 crc kubenswrapper[4837]: E0111 18:24:55.364606 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:25:06 crc kubenswrapper[4837]: I0111 18:25:06.371095 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:25:06 crc kubenswrapper[4837]: E0111 18:25:06.371992 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:25:18 crc kubenswrapper[4837]: I0111 18:25:18.364714 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:25:18 crc kubenswrapper[4837]: E0111 18:25:18.365327 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 
11 18:25:30 crc kubenswrapper[4837]: I0111 18:25:30.365339 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:25:30 crc kubenswrapper[4837]: E0111 18:25:30.366587 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:25:45 crc kubenswrapper[4837]: I0111 18:25:45.363863 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:25:46 crc kubenswrapper[4837]: I0111 18:25:46.490579 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"4dac9e020402176affb7612f80710fb29436721f04c0573c72f72cf26ab9f649"} Jan 11 18:28:09 crc kubenswrapper[4837]: I0111 18:28:09.444276 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:28:09 crc kubenswrapper[4837]: I0111 18:28:09.444866 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.650186 4837 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lb869"] Jan 11 18:28:36 crc kubenswrapper[4837]: E0111 18:28:36.651275 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="extract-utilities" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.651293 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="extract-utilities" Jan 11 18:28:36 crc kubenswrapper[4837]: E0111 18:28:36.651313 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="extract-content" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.651321 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="extract-content" Jan 11 18:28:36 crc kubenswrapper[4837]: E0111 18:28:36.651343 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="registry-server" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.651352 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="registry-server" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.651589 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d5171a-d111-4d41-bac3-ad548e082e18" containerName="registry-server" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.653521 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.665787 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lb869"] Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.701491 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-utilities\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.701579 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfml\" (UniqueName: \"kubernetes.io/projected/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-kube-api-access-2xfml\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.701784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-catalog-content\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.803283 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-utilities\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.803393 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2xfml\" (UniqueName: \"kubernetes.io/projected/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-kube-api-access-2xfml\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.803580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-catalog-content\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.803853 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-utilities\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.804147 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-catalog-content\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.823605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfml\" (UniqueName: \"kubernetes.io/projected/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-kube-api-access-2xfml\") pod \"community-operators-lb869\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:36 crc kubenswrapper[4837]: I0111 18:28:36.979368 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:37 crc kubenswrapper[4837]: I0111 18:28:37.521135 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lb869"] Jan 11 18:28:38 crc kubenswrapper[4837]: I0111 18:28:38.230244 4837 generic.go:334] "Generic (PLEG): container finished" podID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerID="73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112" exitCode=0 Jan 11 18:28:38 crc kubenswrapper[4837]: I0111 18:28:38.230354 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lb869" event={"ID":"9e6d86f8-298e-42c1-89a1-32d2006b6e4f","Type":"ContainerDied","Data":"73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112"} Jan 11 18:28:38 crc kubenswrapper[4837]: I0111 18:28:38.230549 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lb869" event={"ID":"9e6d86f8-298e-42c1-89a1-32d2006b6e4f","Type":"ContainerStarted","Data":"b91a18cc16acc168a69f1bafead77f3ff97a1b793d3abe5c007d73a6483866f6"} Jan 11 18:28:38 crc kubenswrapper[4837]: I0111 18:28:38.233703 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 18:28:39 crc kubenswrapper[4837]: I0111 18:28:39.244883 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lb869" event={"ID":"9e6d86f8-298e-42c1-89a1-32d2006b6e4f","Type":"ContainerStarted","Data":"0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d"} Jan 11 18:28:39 crc kubenswrapper[4837]: I0111 18:28:39.444765 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 
11 18:28:39 crc kubenswrapper[4837]: I0111 18:28:39.445029 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:28:40 crc kubenswrapper[4837]: I0111 18:28:40.255367 4837 generic.go:334] "Generic (PLEG): container finished" podID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerID="0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d" exitCode=0 Jan 11 18:28:40 crc kubenswrapper[4837]: I0111 18:28:40.255409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lb869" event={"ID":"9e6d86f8-298e-42c1-89a1-32d2006b6e4f","Type":"ContainerDied","Data":"0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d"} Jan 11 18:28:41 crc kubenswrapper[4837]: I0111 18:28:41.268437 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lb869" event={"ID":"9e6d86f8-298e-42c1-89a1-32d2006b6e4f","Type":"ContainerStarted","Data":"da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280"} Jan 11 18:28:41 crc kubenswrapper[4837]: I0111 18:28:41.294579 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lb869" podStartSLOduration=2.859129938 podStartE2EDuration="5.294557983s" podCreationTimestamp="2026-01-11 18:28:36 +0000 UTC" firstStartedPulling="2026-01-11 18:28:38.233384449 +0000 UTC m=+3492.411577155" lastFinishedPulling="2026-01-11 18:28:40.668812494 +0000 UTC m=+3494.847005200" observedRunningTime="2026-01-11 18:28:41.290026771 +0000 UTC m=+3495.468219497" watchObservedRunningTime="2026-01-11 18:28:41.294557983 +0000 UTC m=+3495.472750699" Jan 11 18:28:46 crc kubenswrapper[4837]: I0111 18:28:46.979858 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:46 crc kubenswrapper[4837]: I0111 18:28:46.980336 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:47 crc kubenswrapper[4837]: I0111 18:28:47.060640 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:47 crc kubenswrapper[4837]: I0111 18:28:47.383823 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:47 crc kubenswrapper[4837]: I0111 18:28:47.452985 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lb869"] Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.341031 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lb869" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="registry-server" containerID="cri-o://da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280" gracePeriod=2 Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.858242 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.879439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xfml\" (UniqueName: \"kubernetes.io/projected/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-kube-api-access-2xfml\") pod \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.879663 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-utilities\") pod \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.879834 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-catalog-content\") pod \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\" (UID: \"9e6d86f8-298e-42c1-89a1-32d2006b6e4f\") " Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.880782 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-utilities" (OuterVolumeSpecName: "utilities") pod "9e6d86f8-298e-42c1-89a1-32d2006b6e4f" (UID: "9e6d86f8-298e-42c1-89a1-32d2006b6e4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.923126 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-kube-api-access-2xfml" (OuterVolumeSpecName: "kube-api-access-2xfml") pod "9e6d86f8-298e-42c1-89a1-32d2006b6e4f" (UID: "9e6d86f8-298e-42c1-89a1-32d2006b6e4f"). InnerVolumeSpecName "kube-api-access-2xfml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.939384 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e6d86f8-298e-42c1-89a1-32d2006b6e4f" (UID: "9e6d86f8-298e-42c1-89a1-32d2006b6e4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.983238 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.983742 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:28:49 crc kubenswrapper[4837]: I0111 18:28:49.983759 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xfml\" (UniqueName: \"kubernetes.io/projected/9e6d86f8-298e-42c1-89a1-32d2006b6e4f-kube-api-access-2xfml\") on node \"crc\" DevicePath \"\"" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.373134 4837 generic.go:334] "Generic (PLEG): container finished" podID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerID="da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280" exitCode=0 Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.373288 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lb869" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.384171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lb869" event={"ID":"9e6d86f8-298e-42c1-89a1-32d2006b6e4f","Type":"ContainerDied","Data":"da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280"} Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.384229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lb869" event={"ID":"9e6d86f8-298e-42c1-89a1-32d2006b6e4f","Type":"ContainerDied","Data":"b91a18cc16acc168a69f1bafead77f3ff97a1b793d3abe5c007d73a6483866f6"} Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.384259 4837 scope.go:117] "RemoveContainer" containerID="da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.434166 4837 scope.go:117] "RemoveContainer" containerID="0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.434599 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lb869"] Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.444373 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lb869"] Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.455225 4837 scope.go:117] "RemoveContainer" containerID="73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.503305 4837 scope.go:117] "RemoveContainer" containerID="da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280" Jan 11 18:28:50 crc kubenswrapper[4837]: E0111 18:28:50.503806 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280\": container with ID starting with da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280 not found: ID does not exist" containerID="da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.503845 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280"} err="failed to get container status \"da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280\": rpc error: code = NotFound desc = could not find container \"da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280\": container with ID starting with da5985f31d5c6b1bfc418a4fb4beedbe4192606679ecf5e61c839d3f2221c280 not found: ID does not exist" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.503868 4837 scope.go:117] "RemoveContainer" containerID="0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d" Jan 11 18:28:50 crc kubenswrapper[4837]: E0111 18:28:50.504381 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d\": container with ID starting with 0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d not found: ID does not exist" containerID="0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.504411 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d"} err="failed to get container status \"0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d\": rpc error: code = NotFound desc = could not find container \"0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d\": container with ID 
starting with 0cd1b168f4463ff87024eed8d6a97ebaffcf7a58e0fb01ce4eb99f2f92de700d not found: ID does not exist" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.504430 4837 scope.go:117] "RemoveContainer" containerID="73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112" Jan 11 18:28:50 crc kubenswrapper[4837]: E0111 18:28:50.504893 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112\": container with ID starting with 73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112 not found: ID does not exist" containerID="73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112" Jan 11 18:28:50 crc kubenswrapper[4837]: I0111 18:28:50.504920 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112"} err="failed to get container status \"73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112\": rpc error: code = NotFound desc = could not find container \"73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112\": container with ID starting with 73c05f8534296a117ffa1b50d40ef016b104cda9557d27e8a20fe8a5101a6112 not found: ID does not exist" Jan 11 18:28:52 crc kubenswrapper[4837]: I0111 18:28:52.384081 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" path="/var/lib/kubelet/pods/9e6d86f8-298e-42c1-89a1-32d2006b6e4f/volumes" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.385105 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dtkld"] Jan 11 18:29:00 crc kubenswrapper[4837]: E0111 18:29:00.386249 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="registry-server" Jan 11 18:29:00 crc 
kubenswrapper[4837]: I0111 18:29:00.386271 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="registry-server" Jan 11 18:29:00 crc kubenswrapper[4837]: E0111 18:29:00.386297 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="extract-utilities" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.386308 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="extract-utilities" Jan 11 18:29:00 crc kubenswrapper[4837]: E0111 18:29:00.386323 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="extract-content" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.386333 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="extract-content" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.386711 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6d86f8-298e-42c1-89a1-32d2006b6e4f" containerName="registry-server" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.388907 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtkld"] Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.389298 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.523950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-utilities\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.524058 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrf8t\" (UniqueName: \"kubernetes.io/projected/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-kube-api-access-lrf8t\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.524169 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-catalog-content\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.627267 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-utilities\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.627322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf8t\" (UniqueName: \"kubernetes.io/projected/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-kube-api-access-lrf8t\") pod 
\"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.627345 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-catalog-content\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.628094 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-utilities\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.629342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-catalog-content\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.651517 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf8t\" (UniqueName: \"kubernetes.io/projected/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-kube-api-access-lrf8t\") pod \"redhat-marketplace-dtkld\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:00 crc kubenswrapper[4837]: I0111 18:29:00.724937 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:01 crc kubenswrapper[4837]: I0111 18:29:01.203417 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtkld"] Jan 11 18:29:01 crc kubenswrapper[4837]: W0111 18:29:01.206522 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee0f3a8_c153_4fad_b4d4_3d6aff3de68b.slice/crio-e0daecbe477e279a73a10767e04db5c9794943ef22fc2e619d0a3141ea592fce WatchSource:0}: Error finding container e0daecbe477e279a73a10767e04db5c9794943ef22fc2e619d0a3141ea592fce: Status 404 returned error can't find the container with id e0daecbe477e279a73a10767e04db5c9794943ef22fc2e619d0a3141ea592fce Jan 11 18:29:01 crc kubenswrapper[4837]: I0111 18:29:01.478645 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtkld" event={"ID":"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b","Type":"ContainerStarted","Data":"a34866db46716c6a7a02e085f059bc28f691429bc5adfcac003fb72c0817052a"} Jan 11 18:29:01 crc kubenswrapper[4837]: I0111 18:29:01.478714 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtkld" event={"ID":"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b","Type":"ContainerStarted","Data":"e0daecbe477e279a73a10767e04db5c9794943ef22fc2e619d0a3141ea592fce"} Jan 11 18:29:02 crc kubenswrapper[4837]: I0111 18:29:02.491494 4837 generic.go:334] "Generic (PLEG): container finished" podID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerID="a34866db46716c6a7a02e085f059bc28f691429bc5adfcac003fb72c0817052a" exitCode=0 Jan 11 18:29:02 crc kubenswrapper[4837]: I0111 18:29:02.491838 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtkld" 
event={"ID":"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b","Type":"ContainerDied","Data":"a34866db46716c6a7a02e085f059bc28f691429bc5adfcac003fb72c0817052a"} Jan 11 18:29:03 crc kubenswrapper[4837]: I0111 18:29:03.503319 4837 generic.go:334] "Generic (PLEG): container finished" podID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerID="0cb2b8668501931d1b36e29c6f4861044ad8b3fb2e95db3f0d3a6a85d502d4e9" exitCode=0 Jan 11 18:29:03 crc kubenswrapper[4837]: I0111 18:29:03.503383 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtkld" event={"ID":"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b","Type":"ContainerDied","Data":"0cb2b8668501931d1b36e29c6f4861044ad8b3fb2e95db3f0d3a6a85d502d4e9"} Jan 11 18:29:04 crc kubenswrapper[4837]: I0111 18:29:04.515957 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtkld" event={"ID":"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b","Type":"ContainerStarted","Data":"4e2aaa90d78c6c9e5cd8a4743348828d25024dbd559fb74f26f75a8bfb7f2874"} Jan 11 18:29:04 crc kubenswrapper[4837]: I0111 18:29:04.544067 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dtkld" podStartSLOduration=2.931045056 podStartE2EDuration="4.544050194s" podCreationTimestamp="2026-01-11 18:29:00 +0000 UTC" firstStartedPulling="2026-01-11 18:29:02.493560127 +0000 UTC m=+3516.671752843" lastFinishedPulling="2026-01-11 18:29:04.106565275 +0000 UTC m=+3518.284757981" observedRunningTime="2026-01-11 18:29:04.531254731 +0000 UTC m=+3518.709447467" watchObservedRunningTime="2026-01-11 18:29:04.544050194 +0000 UTC m=+3518.722242900" Jan 11 18:29:09 crc kubenswrapper[4837]: I0111 18:29:09.443869 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Jan 11 18:29:09 crc kubenswrapper[4837]: I0111 18:29:09.444357 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:29:09 crc kubenswrapper[4837]: I0111 18:29:09.444410 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:29:09 crc kubenswrapper[4837]: I0111 18:29:09.445242 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dac9e020402176affb7612f80710fb29436721f04c0573c72f72cf26ab9f649"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:29:09 crc kubenswrapper[4837]: I0111 18:29:09.445308 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://4dac9e020402176affb7612f80710fb29436721f04c0573c72f72cf26ab9f649" gracePeriod=600 Jan 11 18:29:10 crc kubenswrapper[4837]: I0111 18:29:10.581498 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="4dac9e020402176affb7612f80710fb29436721f04c0573c72f72cf26ab9f649" exitCode=0 Jan 11 18:29:10 crc kubenswrapper[4837]: I0111 18:29:10.581944 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" 
event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"4dac9e020402176affb7612f80710fb29436721f04c0573c72f72cf26ab9f649"} Jan 11 18:29:10 crc kubenswrapper[4837]: I0111 18:29:10.581974 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926"} Jan 11 18:29:10 crc kubenswrapper[4837]: I0111 18:29:10.581993 4837 scope.go:117] "RemoveContainer" containerID="39b5b7180c0907dd777b55e3381ab4312f0b40d0e0b28505d8e75e226146dfdf" Jan 11 18:29:10 crc kubenswrapper[4837]: I0111 18:29:10.726020 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:10 crc kubenswrapper[4837]: I0111 18:29:10.726178 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:10 crc kubenswrapper[4837]: I0111 18:29:10.800439 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:11 crc kubenswrapper[4837]: I0111 18:29:11.646418 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:11 crc kubenswrapper[4837]: I0111 18:29:11.709237 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtkld"] Jan 11 18:29:13 crc kubenswrapper[4837]: I0111 18:29:13.610060 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dtkld" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="registry-server" containerID="cri-o://4e2aaa90d78c6c9e5cd8a4743348828d25024dbd559fb74f26f75a8bfb7f2874" gracePeriod=2 Jan 11 18:29:14 crc 
kubenswrapper[4837]: I0111 18:29:14.620095 4837 generic.go:334] "Generic (PLEG): container finished" podID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerID="4e2aaa90d78c6c9e5cd8a4743348828d25024dbd559fb74f26f75a8bfb7f2874" exitCode=0 Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.620211 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtkld" event={"ID":"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b","Type":"ContainerDied","Data":"4e2aaa90d78c6c9e5cd8a4743348828d25024dbd559fb74f26f75a8bfb7f2874"} Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.620629 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtkld" event={"ID":"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b","Type":"ContainerDied","Data":"e0daecbe477e279a73a10767e04db5c9794943ef22fc2e619d0a3141ea592fce"} Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.620645 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0daecbe477e279a73a10767e04db5c9794943ef22fc2e619d0a3141ea592fce" Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.626025 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.757103 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-catalog-content\") pod \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.757438 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-utilities\") pod \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.757511 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrf8t\" (UniqueName: \"kubernetes.io/projected/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-kube-api-access-lrf8t\") pod \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\" (UID: \"6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b\") " Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.759472 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-utilities" (OuterVolumeSpecName: "utilities") pod "6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" (UID: "6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.764890 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-kube-api-access-lrf8t" (OuterVolumeSpecName: "kube-api-access-lrf8t") pod "6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" (UID: "6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b"). InnerVolumeSpecName "kube-api-access-lrf8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.788243 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" (UID: "6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.860271 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrf8t\" (UniqueName: \"kubernetes.io/projected/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-kube-api-access-lrf8t\") on node \"crc\" DevicePath \"\"" Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.860529 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:29:14 crc kubenswrapper[4837]: I0111 18:29:14.860621 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:29:15 crc kubenswrapper[4837]: I0111 18:29:15.631101 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtkld" Jan 11 18:29:15 crc kubenswrapper[4837]: I0111 18:29:15.686944 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtkld"] Jan 11 18:29:15 crc kubenswrapper[4837]: I0111 18:29:15.702483 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtkld"] Jan 11 18:29:16 crc kubenswrapper[4837]: I0111 18:29:16.379026 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" path="/var/lib/kubelet/pods/6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b/volumes" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.147603 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc"] Jan 11 18:30:00 crc kubenswrapper[4837]: E0111 18:30:00.148510 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="extract-utilities" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.148522 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="extract-utilities" Jan 11 18:30:00 crc kubenswrapper[4837]: E0111 18:30:00.148546 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="registry-server" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.148552 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="registry-server" Jan 11 18:30:00 crc kubenswrapper[4837]: E0111 18:30:00.148566 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="extract-content" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.148572 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="extract-content" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.148788 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee0f3a8-c153-4fad-b4d4-3d6aff3de68b" containerName="registry-server" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.149445 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.152019 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.152028 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.156407 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc"] Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.227120 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m95m\" (UniqueName: \"kubernetes.io/projected/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-kube-api-access-4m95m\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.227162 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-secret-volume\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 
crc kubenswrapper[4837]: I0111 18:30:00.227269 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-config-volume\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.328007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m95m\" (UniqueName: \"kubernetes.io/projected/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-kube-api-access-4m95m\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.328051 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-secret-volume\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.328118 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-config-volume\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.328888 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-config-volume\") pod \"collect-profiles-29469270-js8gc\" (UID: 
\"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.334455 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-secret-volume\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.351079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m95m\" (UniqueName: \"kubernetes.io/projected/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-kube-api-access-4m95m\") pod \"collect-profiles-29469270-js8gc\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.470123 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:00 crc kubenswrapper[4837]: I0111 18:30:00.990925 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc"] Jan 11 18:30:01 crc kubenswrapper[4837]: I0111 18:30:01.047780 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" event={"ID":"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd","Type":"ContainerStarted","Data":"e24b295b659fe787dfd27000fba6c5da382149b8b63f39e2366c5483ad9bdca2"} Jan 11 18:30:02 crc kubenswrapper[4837]: I0111 18:30:02.058150 4837 generic.go:334] "Generic (PLEG): container finished" podID="96bcfd45-dc59-4c9e-8f6e-e5ce407201fd" containerID="22315b253751a071cf66fc70354e2261d532231e6622c82b20acef3881225ccc" exitCode=0 Jan 11 18:30:02 crc kubenswrapper[4837]: I0111 18:30:02.058204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" event={"ID":"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd","Type":"ContainerDied","Data":"22315b253751a071cf66fc70354e2261d532231e6622c82b20acef3881225ccc"} Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.499535 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.598211 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m95m\" (UniqueName: \"kubernetes.io/projected/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-kube-api-access-4m95m\") pod \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.598297 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-config-volume\") pod \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.598414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-secret-volume\") pod \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\" (UID: \"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd\") " Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.599457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-config-volume" (OuterVolumeSpecName: "config-volume") pod "96bcfd45-dc59-4c9e-8f6e-e5ce407201fd" (UID: "96bcfd45-dc59-4c9e-8f6e-e5ce407201fd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.604814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "96bcfd45-dc59-4c9e-8f6e-e5ce407201fd" (UID: "96bcfd45-dc59-4c9e-8f6e-e5ce407201fd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.605335 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-kube-api-access-4m95m" (OuterVolumeSpecName: "kube-api-access-4m95m") pod "96bcfd45-dc59-4c9e-8f6e-e5ce407201fd" (UID: "96bcfd45-dc59-4c9e-8f6e-e5ce407201fd"). InnerVolumeSpecName "kube-api-access-4m95m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.700254 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.700306 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m95m\" (UniqueName: \"kubernetes.io/projected/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-kube-api-access-4m95m\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:03 crc kubenswrapper[4837]: I0111 18:30:03.700329 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96bcfd45-dc59-4c9e-8f6e-e5ce407201fd-config-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:04 crc kubenswrapper[4837]: I0111 18:30:04.076424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" event={"ID":"96bcfd45-dc59-4c9e-8f6e-e5ce407201fd","Type":"ContainerDied","Data":"e24b295b659fe787dfd27000fba6c5da382149b8b63f39e2366c5483ad9bdca2"} Jan 11 18:30:04 crc kubenswrapper[4837]: I0111 18:30:04.076463 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e24b295b659fe787dfd27000fba6c5da382149b8b63f39e2366c5483ad9bdca2" Jan 11 18:30:04 crc kubenswrapper[4837]: I0111 18:30:04.076532 4837 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469270-js8gc" Jan 11 18:30:04 crc kubenswrapper[4837]: I0111 18:30:04.596222 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp"] Jan 11 18:30:04 crc kubenswrapper[4837]: I0111 18:30:04.606572 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469225-tjvdp"] Jan 11 18:30:06 crc kubenswrapper[4837]: I0111 18:30:06.377288 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b9f39d-7ce7-4626-9a5c-7ed9b0917745" path="/var/lib/kubelet/pods/f2b9f39d-7ce7-4626-9a5c-7ed9b0917745/volumes" Jan 11 18:30:28 crc kubenswrapper[4837]: I0111 18:30:28.311957 4837 generic.go:334] "Generic (PLEG): container finished" podID="803594a1-a21b-4a8d-bf22-a2f1786b3822" containerID="ec0dbe4366ce26d5ceca364a3c5c7b477fd21e1cf35346bc091a56e28a5bc558" exitCode=0 Jan 11 18:30:28 crc kubenswrapper[4837]: I0111 18:30:28.312059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"803594a1-a21b-4a8d-bf22-a2f1786b3822","Type":"ContainerDied","Data":"ec0dbe4366ce26d5ceca364a3c5c7b477fd21e1cf35346bc091a56e28a5bc558"} Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.669726 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811582 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-workdir\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811692 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config-secret\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811726 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-config-data\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ssh-key\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811825 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811853 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qzjl\" (UniqueName: \"kubernetes.io/projected/803594a1-a21b-4a8d-bf22-a2f1786b3822-kube-api-access-7qzjl\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811941 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-temporary\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.811971 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ca-certs\") pod \"803594a1-a21b-4a8d-bf22-a2f1786b3822\" (UID: \"803594a1-a21b-4a8d-bf22-a2f1786b3822\") " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.813363 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.813502 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-config-data" (OuterVolumeSpecName: "config-data") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.820193 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.826444 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.826614 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803594a1-a21b-4a8d-bf22-a2f1786b3822-kube-api-access-7qzjl" (OuterVolumeSpecName: "kube-api-access-7qzjl") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "kube-api-access-7qzjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.845370 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.846857 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.847551 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.869993 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "803594a1-a21b-4a8d-bf22-a2f1786b3822" (UID: "803594a1-a21b-4a8d-bf22-a2f1786b3822"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913798 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913838 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qzjl\" (UniqueName: \"kubernetes.io/projected/803594a1-a21b-4a8d-bf22-a2f1786b3822-kube-api-access-7qzjl\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913852 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913864 4837 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913877 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913890 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/803594a1-a21b-4a8d-bf22-a2f1786b3822-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913900 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-openstack-config-secret\") on node \"crc\" 
DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913926 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/803594a1-a21b-4a8d-bf22-a2f1786b3822-config-data\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.913937 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/803594a1-a21b-4a8d-bf22-a2f1786b3822-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:29 crc kubenswrapper[4837]: I0111 18:30:29.933264 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 11 18:30:30 crc kubenswrapper[4837]: I0111 18:30:30.016006 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 11 18:30:30 crc kubenswrapper[4837]: I0111 18:30:30.332070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"803594a1-a21b-4a8d-bf22-a2f1786b3822","Type":"ContainerDied","Data":"c3c6499993014f892a165a8bf55478adcbd12e8578718b6fdb509cfb9fa85989"} Jan 11 18:30:30 crc kubenswrapper[4837]: I0111 18:30:30.332112 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c6499993014f892a165a8bf55478adcbd12e8578718b6fdb509cfb9fa85989" Jan 11 18:30:30 crc kubenswrapper[4837]: I0111 18:30:30.332188 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.474090 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 11 18:30:33 crc kubenswrapper[4837]: E0111 18:30:33.475128 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803594a1-a21b-4a8d-bf22-a2f1786b3822" containerName="tempest-tests-tempest-tests-runner" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.475156 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="803594a1-a21b-4a8d-bf22-a2f1786b3822" containerName="tempest-tests-tempest-tests-runner" Jan 11 18:30:33 crc kubenswrapper[4837]: E0111 18:30:33.475200 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bcfd45-dc59-4c9e-8f6e-e5ce407201fd" containerName="collect-profiles" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.475213 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bcfd45-dc59-4c9e-8f6e-e5ce407201fd" containerName="collect-profiles" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.475550 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="803594a1-a21b-4a8d-bf22-a2f1786b3822" containerName="tempest-tests-tempest-tests-runner" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.475577 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="96bcfd45-dc59-4c9e-8f6e-e5ce407201fd" containerName="collect-profiles" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.476699 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.479231 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xcp84" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.483227 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.584716 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5t4v\" (UniqueName: \"kubernetes.io/projected/cab1d2b9-90c7-478b-905e-0487cb825e65-kube-api-access-v5t4v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cab1d2b9-90c7-478b-905e-0487cb825e65\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.584825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cab1d2b9-90c7-478b-905e-0487cb825e65\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.686656 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5t4v\" (UniqueName: \"kubernetes.io/projected/cab1d2b9-90c7-478b-905e-0487cb825e65-kube-api-access-v5t4v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cab1d2b9-90c7-478b-905e-0487cb825e65\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.686910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cab1d2b9-90c7-478b-905e-0487cb825e65\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.687665 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cab1d2b9-90c7-478b-905e-0487cb825e65\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.724080 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5t4v\" (UniqueName: \"kubernetes.io/projected/cab1d2b9-90c7-478b-905e-0487cb825e65-kube-api-access-v5t4v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cab1d2b9-90c7-478b-905e-0487cb825e65\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.737926 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cab1d2b9-90c7-478b-905e-0487cb825e65\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:33 crc kubenswrapper[4837]: I0111 18:30:33.805686 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 11 18:30:34 crc kubenswrapper[4837]: I0111 18:30:34.320700 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 11 18:30:34 crc kubenswrapper[4837]: I0111 18:30:34.378426 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"cab1d2b9-90c7-478b-905e-0487cb825e65","Type":"ContainerStarted","Data":"f3fd588e203bfbe54761f9d138497579b3ef4414e436b12c70c65635d6628e3d"} Jan 11 18:30:36 crc kubenswrapper[4837]: I0111 18:30:36.399514 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"cab1d2b9-90c7-478b-905e-0487cb825e65","Type":"ContainerStarted","Data":"16d22d1e6342f0350f2bb8bf94ce39a4058ddddb4723a7576a245b8ae823ceec"} Jan 11 18:30:36 crc kubenswrapper[4837]: I0111 18:30:36.423044 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.508120575 podStartE2EDuration="3.423019257s" podCreationTimestamp="2026-01-11 18:30:33 +0000 UTC" firstStartedPulling="2026-01-11 18:30:34.328046895 +0000 UTC m=+3608.506239641" lastFinishedPulling="2026-01-11 18:30:35.242945617 +0000 UTC m=+3609.421138323" observedRunningTime="2026-01-11 18:30:36.418466534 +0000 UTC m=+3610.596659270" watchObservedRunningTime="2026-01-11 18:30:36.423019257 +0000 UTC m=+3610.601211983" Jan 11 18:30:54 crc kubenswrapper[4837]: I0111 18:30:54.108901 4837 scope.go:117] "RemoveContainer" containerID="532cabea9051f842e4d1c7aedc1a13ba9afd22b80ad1fd825909f599e9c6e86a" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.212351 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wkvxh/must-gather-msgnj"] Jan 11 18:30:58 crc kubenswrapper[4837]: 
I0111 18:30:58.214316 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.216434 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wkvxh"/"default-dockercfg-6mtgc" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.216890 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wkvxh"/"openshift-service-ca.crt" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.217098 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wkvxh"/"kube-root-ca.crt" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.229126 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wkvxh/must-gather-msgnj"] Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.295895 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcqx\" (UniqueName: \"kubernetes.io/projected/45253be1-632b-4c25-bb8b-f726177c8991-kube-api-access-qmcqx\") pod \"must-gather-msgnj\" (UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.296058 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45253be1-632b-4c25-bb8b-f726177c8991-must-gather-output\") pod \"must-gather-msgnj\" (UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.398184 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45253be1-632b-4c25-bb8b-f726177c8991-must-gather-output\") pod \"must-gather-msgnj\" 
(UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.398504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcqx\" (UniqueName: \"kubernetes.io/projected/45253be1-632b-4c25-bb8b-f726177c8991-kube-api-access-qmcqx\") pod \"must-gather-msgnj\" (UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.398726 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45253be1-632b-4c25-bb8b-f726177c8991-must-gather-output\") pod \"must-gather-msgnj\" (UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.416387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcqx\" (UniqueName: \"kubernetes.io/projected/45253be1-632b-4c25-bb8b-f726177c8991-kube-api-access-qmcqx\") pod \"must-gather-msgnj\" (UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:58 crc kubenswrapper[4837]: I0111 18:30:58.533478 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:30:59 crc kubenswrapper[4837]: W0111 18:30:59.027118 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45253be1_632b_4c25_bb8b_f726177c8991.slice/crio-99d9ee1d5632afa2f797056bdebc0df890406c789ccaf1921d30fe18faaa0908 WatchSource:0}: Error finding container 99d9ee1d5632afa2f797056bdebc0df890406c789ccaf1921d30fe18faaa0908: Status 404 returned error can't find the container with id 99d9ee1d5632afa2f797056bdebc0df890406c789ccaf1921d30fe18faaa0908 Jan 11 18:30:59 crc kubenswrapper[4837]: I0111 18:30:59.028137 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wkvxh/must-gather-msgnj"] Jan 11 18:30:59 crc kubenswrapper[4837]: I0111 18:30:59.682794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/must-gather-msgnj" event={"ID":"45253be1-632b-4c25-bb8b-f726177c8991","Type":"ContainerStarted","Data":"99d9ee1d5632afa2f797056bdebc0df890406c789ccaf1921d30fe18faaa0908"} Jan 11 18:31:07 crc kubenswrapper[4837]: I0111 18:31:07.786627 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/must-gather-msgnj" event={"ID":"45253be1-632b-4c25-bb8b-f726177c8991","Type":"ContainerStarted","Data":"664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32"} Jan 11 18:31:07 crc kubenswrapper[4837]: I0111 18:31:07.787767 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/must-gather-msgnj" event={"ID":"45253be1-632b-4c25-bb8b-f726177c8991","Type":"ContainerStarted","Data":"d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b"} Jan 11 18:31:07 crc kubenswrapper[4837]: I0111 18:31:07.824299 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wkvxh/must-gather-msgnj" podStartSLOduration=2.044974287 
podStartE2EDuration="9.824280593s" podCreationTimestamp="2026-01-11 18:30:58 +0000 UTC" firstStartedPulling="2026-01-11 18:30:59.028889445 +0000 UTC m=+3633.207082151" lastFinishedPulling="2026-01-11 18:31:06.808195751 +0000 UTC m=+3640.986388457" observedRunningTime="2026-01-11 18:31:07.808129078 +0000 UTC m=+3641.986321784" watchObservedRunningTime="2026-01-11 18:31:07.824280593 +0000 UTC m=+3642.002473299" Jan 11 18:31:09 crc kubenswrapper[4837]: I0111 18:31:09.443582 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:31:09 crc kubenswrapper[4837]: I0111 18:31:09.444040 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.097315 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-bnq6z"] Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.098425 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.196875 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dbvq\" (UniqueName: \"kubernetes.io/projected/f5d606a0-5fb4-47ae-b493-4e8956256ba6-kube-api-access-2dbvq\") pod \"crc-debug-bnq6z\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.197377 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d606a0-5fb4-47ae-b493-4e8956256ba6-host\") pod \"crc-debug-bnq6z\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.299432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dbvq\" (UniqueName: \"kubernetes.io/projected/f5d606a0-5fb4-47ae-b493-4e8956256ba6-kube-api-access-2dbvq\") pod \"crc-debug-bnq6z\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.299586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d606a0-5fb4-47ae-b493-4e8956256ba6-host\") pod \"crc-debug-bnq6z\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.299791 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d606a0-5fb4-47ae-b493-4e8956256ba6-host\") pod \"crc-debug-bnq6z\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc 
kubenswrapper[4837]: I0111 18:31:11.324258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dbvq\" (UniqueName: \"kubernetes.io/projected/f5d606a0-5fb4-47ae-b493-4e8956256ba6-kube-api-access-2dbvq\") pod \"crc-debug-bnq6z\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.418181 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:31:11 crc kubenswrapper[4837]: W0111 18:31:11.445467 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d606a0_5fb4_47ae_b493_4e8956256ba6.slice/crio-8d3608cacb2e8be3bbe727f1b79b7363a274dab4543e390ea06f42ebd4ef6dc5 WatchSource:0}: Error finding container 8d3608cacb2e8be3bbe727f1b79b7363a274dab4543e390ea06f42ebd4ef6dc5: Status 404 returned error can't find the container with id 8d3608cacb2e8be3bbe727f1b79b7363a274dab4543e390ea06f42ebd4ef6dc5 Jan 11 18:31:11 crc kubenswrapper[4837]: I0111 18:31:11.824300 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" event={"ID":"f5d606a0-5fb4-47ae-b493-4e8956256ba6","Type":"ContainerStarted","Data":"8d3608cacb2e8be3bbe727f1b79b7363a274dab4543e390ea06f42ebd4ef6dc5"} Jan 11 18:31:24 crc kubenswrapper[4837]: I0111 18:31:24.933377 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" event={"ID":"f5d606a0-5fb4-47ae-b493-4e8956256ba6","Type":"ContainerStarted","Data":"2f0b2b94cc7a5404adef5c0c69a2d2ffc79086e54a4eb7b484e6ca21b6880702"} Jan 11 18:31:24 crc kubenswrapper[4837]: I0111 18:31:24.951731 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" podStartSLOduration=0.943479886 podStartE2EDuration="13.951707682s" 
podCreationTimestamp="2026-01-11 18:31:11 +0000 UTC" firstStartedPulling="2026-01-11 18:31:11.447289589 +0000 UTC m=+3645.625482295" lastFinishedPulling="2026-01-11 18:31:24.455517385 +0000 UTC m=+3658.633710091" observedRunningTime="2026-01-11 18:31:24.947417608 +0000 UTC m=+3659.125610344" watchObservedRunningTime="2026-01-11 18:31:24.951707682 +0000 UTC m=+3659.129900398" Jan 11 18:31:39 crc kubenswrapper[4837]: I0111 18:31:39.443711 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:31:39 crc kubenswrapper[4837]: I0111 18:31:39.444238 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:32:01 crc kubenswrapper[4837]: I0111 18:32:01.261380 4837 generic.go:334] "Generic (PLEG): container finished" podID="f5d606a0-5fb4-47ae-b493-4e8956256ba6" containerID="2f0b2b94cc7a5404adef5c0c69a2d2ffc79086e54a4eb7b484e6ca21b6880702" exitCode=0 Jan 11 18:32:01 crc kubenswrapper[4837]: I0111 18:32:01.261424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" event={"ID":"f5d606a0-5fb4-47ae-b493-4e8956256ba6","Type":"ContainerDied","Data":"2f0b2b94cc7a5404adef5c0c69a2d2ffc79086e54a4eb7b484e6ca21b6880702"} Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.390753 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.427783 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-bnq6z"] Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.437649 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-bnq6z"] Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.541040 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d606a0-5fb4-47ae-b493-4e8956256ba6-host\") pod \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.541208 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5d606a0-5fb4-47ae-b493-4e8956256ba6-host" (OuterVolumeSpecName: "host") pod "f5d606a0-5fb4-47ae-b493-4e8956256ba6" (UID: "f5d606a0-5fb4-47ae-b493-4e8956256ba6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.541283 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dbvq\" (UniqueName: \"kubernetes.io/projected/f5d606a0-5fb4-47ae-b493-4e8956256ba6-kube-api-access-2dbvq\") pod \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\" (UID: \"f5d606a0-5fb4-47ae-b493-4e8956256ba6\") " Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.542174 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d606a0-5fb4-47ae-b493-4e8956256ba6-host\") on node \"crc\" DevicePath \"\"" Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.547528 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d606a0-5fb4-47ae-b493-4e8956256ba6-kube-api-access-2dbvq" (OuterVolumeSpecName: "kube-api-access-2dbvq") pod "f5d606a0-5fb4-47ae-b493-4e8956256ba6" (UID: "f5d606a0-5fb4-47ae-b493-4e8956256ba6"). InnerVolumeSpecName "kube-api-access-2dbvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:32:02 crc kubenswrapper[4837]: I0111 18:32:02.644978 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dbvq\" (UniqueName: \"kubernetes.io/projected/f5d606a0-5fb4-47ae-b493-4e8956256ba6-kube-api-access-2dbvq\") on node \"crc\" DevicePath \"\"" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.287319 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d3608cacb2e8be3bbe727f1b79b7363a274dab4543e390ea06f42ebd4ef6dc5" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.287484 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-bnq6z" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.599721 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-qgmkb"] Jan 11 18:32:03 crc kubenswrapper[4837]: E0111 18:32:03.600290 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d606a0-5fb4-47ae-b493-4e8956256ba6" containerName="container-00" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.600311 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d606a0-5fb4-47ae-b493-4e8956256ba6" containerName="container-00" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.600665 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d606a0-5fb4-47ae-b493-4e8956256ba6" containerName="container-00" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.601654 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.663847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tdn\" (UniqueName: \"kubernetes.io/projected/30805298-6272-40e2-a05a-d7b47d8b0bb4-kube-api-access-r7tdn\") pod \"crc-debug-qgmkb\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.664096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30805298-6272-40e2-a05a-d7b47d8b0bb4-host\") pod \"crc-debug-qgmkb\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.765668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/30805298-6272-40e2-a05a-d7b47d8b0bb4-host\") pod \"crc-debug-qgmkb\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.765842 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tdn\" (UniqueName: \"kubernetes.io/projected/30805298-6272-40e2-a05a-d7b47d8b0bb4-kube-api-access-r7tdn\") pod \"crc-debug-qgmkb\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.766550 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30805298-6272-40e2-a05a-d7b47d8b0bb4-host\") pod \"crc-debug-qgmkb\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.784545 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tdn\" (UniqueName: \"kubernetes.io/projected/30805298-6272-40e2-a05a-d7b47d8b0bb4-kube-api-access-r7tdn\") pod \"crc-debug-qgmkb\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:03 crc kubenswrapper[4837]: I0111 18:32:03.936161 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:04 crc kubenswrapper[4837]: I0111 18:32:04.301250 4837 generic.go:334] "Generic (PLEG): container finished" podID="30805298-6272-40e2-a05a-d7b47d8b0bb4" containerID="3ac8b730d0924b65be9170e112cd4c19b891c8bed12714d223023abfbb6cf1f2" exitCode=0 Jan 11 18:32:04 crc kubenswrapper[4837]: I0111 18:32:04.301408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" event={"ID":"30805298-6272-40e2-a05a-d7b47d8b0bb4","Type":"ContainerDied","Data":"3ac8b730d0924b65be9170e112cd4c19b891c8bed12714d223023abfbb6cf1f2"} Jan 11 18:32:04 crc kubenswrapper[4837]: I0111 18:32:04.301581 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" event={"ID":"30805298-6272-40e2-a05a-d7b47d8b0bb4","Type":"ContainerStarted","Data":"0fdeda3143e0586285a4564cfbff4bec16b841565ca65e54422b1854542dc80d"} Jan 11 18:32:04 crc kubenswrapper[4837]: I0111 18:32:04.373348 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d606a0-5fb4-47ae-b493-4e8956256ba6" path="/var/lib/kubelet/pods/f5d606a0-5fb4-47ae-b493-4e8956256ba6/volumes" Jan 11 18:32:04 crc kubenswrapper[4837]: I0111 18:32:04.774343 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-qgmkb"] Jan 11 18:32:04 crc kubenswrapper[4837]: I0111 18:32:04.781764 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-qgmkb"] Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.419928 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.604827 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30805298-6272-40e2-a05a-d7b47d8b0bb4-host\") pod \"30805298-6272-40e2-a05a-d7b47d8b0bb4\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.604984 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30805298-6272-40e2-a05a-d7b47d8b0bb4-host" (OuterVolumeSpecName: "host") pod "30805298-6272-40e2-a05a-d7b47d8b0bb4" (UID: "30805298-6272-40e2-a05a-d7b47d8b0bb4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.605064 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7tdn\" (UniqueName: \"kubernetes.io/projected/30805298-6272-40e2-a05a-d7b47d8b0bb4-kube-api-access-r7tdn\") pod \"30805298-6272-40e2-a05a-d7b47d8b0bb4\" (UID: \"30805298-6272-40e2-a05a-d7b47d8b0bb4\") " Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.606333 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30805298-6272-40e2-a05a-d7b47d8b0bb4-host\") on node \"crc\" DevicePath \"\"" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.622730 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30805298-6272-40e2-a05a-d7b47d8b0bb4-kube-api-access-r7tdn" (OuterVolumeSpecName: "kube-api-access-r7tdn") pod "30805298-6272-40e2-a05a-d7b47d8b0bb4" (UID: "30805298-6272-40e2-a05a-d7b47d8b0bb4"). InnerVolumeSpecName "kube-api-access-r7tdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.708122 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7tdn\" (UniqueName: \"kubernetes.io/projected/30805298-6272-40e2-a05a-d7b47d8b0bb4-kube-api-access-r7tdn\") on node \"crc\" DevicePath \"\"" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.981781 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-r9gp6"] Jan 11 18:32:05 crc kubenswrapper[4837]: E0111 18:32:05.982408 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30805298-6272-40e2-a05a-d7b47d8b0bb4" containerName="container-00" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.982428 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="30805298-6272-40e2-a05a-d7b47d8b0bb4" containerName="container-00" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.982804 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="30805298-6272-40e2-a05a-d7b47d8b0bb4" containerName="container-00" Jan 11 18:32:05 crc kubenswrapper[4837]: I0111 18:32:05.983791 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.116077 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml59h\" (UniqueName: \"kubernetes.io/projected/78be57f3-c282-42ed-98cb-12d791c82f35-kube-api-access-ml59h\") pod \"crc-debug-r9gp6\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.116419 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78be57f3-c282-42ed-98cb-12d791c82f35-host\") pod \"crc-debug-r9gp6\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.217773 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml59h\" (UniqueName: \"kubernetes.io/projected/78be57f3-c282-42ed-98cb-12d791c82f35-kube-api-access-ml59h\") pod \"crc-debug-r9gp6\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.217943 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78be57f3-c282-42ed-98cb-12d791c82f35-host\") pod \"crc-debug-r9gp6\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.218066 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78be57f3-c282-42ed-98cb-12d791c82f35-host\") pod \"crc-debug-r9gp6\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc 
kubenswrapper[4837]: I0111 18:32:06.239579 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml59h\" (UniqueName: \"kubernetes.io/projected/78be57f3-c282-42ed-98cb-12d791c82f35-kube-api-access-ml59h\") pod \"crc-debug-r9gp6\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.314059 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.325285 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fdeda3143e0586285a4564cfbff4bec16b841565ca65e54422b1854542dc80d" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.325396 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:06 crc kubenswrapper[4837]: I0111 18:32:06.375001 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30805298-6272-40e2-a05a-d7b47d8b0bb4" path="/var/lib/kubelet/pods/30805298-6272-40e2-a05a-d7b47d8b0bb4/volumes" Jan 11 18:32:07 crc kubenswrapper[4837]: I0111 18:32:07.335575 4837 generic.go:334] "Generic (PLEG): container finished" podID="78be57f3-c282-42ed-98cb-12d791c82f35" containerID="2878dff4de5afde64dfbbc2a622d1581e04aabe3f5810a4210ce6c8caf30d6e2" exitCode=0 Jan 11 18:32:07 crc kubenswrapper[4837]: I0111 18:32:07.335784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" event={"ID":"78be57f3-c282-42ed-98cb-12d791c82f35","Type":"ContainerDied","Data":"2878dff4de5afde64dfbbc2a622d1581e04aabe3f5810a4210ce6c8caf30d6e2"} Jan 11 18:32:07 crc kubenswrapper[4837]: I0111 18:32:07.335933 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" 
event={"ID":"78be57f3-c282-42ed-98cb-12d791c82f35","Type":"ContainerStarted","Data":"04efb488a54f0ef069864e6ed2881b0e7fa350435857eedb2e77a8e46a4c3b09"} Jan 11 18:32:07 crc kubenswrapper[4837]: I0111 18:32:07.374328 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-r9gp6"] Jan 11 18:32:07 crc kubenswrapper[4837]: I0111 18:32:07.395579 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wkvxh/crc-debug-r9gp6"] Jan 11 18:32:08 crc kubenswrapper[4837]: I0111 18:32:08.460569 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:08 crc kubenswrapper[4837]: I0111 18:32:08.577685 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78be57f3-c282-42ed-98cb-12d791c82f35-host\") pod \"78be57f3-c282-42ed-98cb-12d791c82f35\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " Jan 11 18:32:08 crc kubenswrapper[4837]: I0111 18:32:08.577787 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml59h\" (UniqueName: \"kubernetes.io/projected/78be57f3-c282-42ed-98cb-12d791c82f35-kube-api-access-ml59h\") pod \"78be57f3-c282-42ed-98cb-12d791c82f35\" (UID: \"78be57f3-c282-42ed-98cb-12d791c82f35\") " Jan 11 18:32:08 crc kubenswrapper[4837]: I0111 18:32:08.577929 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78be57f3-c282-42ed-98cb-12d791c82f35-host" (OuterVolumeSpecName: "host") pod "78be57f3-c282-42ed-98cb-12d791c82f35" (UID: "78be57f3-c282-42ed-98cb-12d791c82f35"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 18:32:08 crc kubenswrapper[4837]: I0111 18:32:08.578327 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78be57f3-c282-42ed-98cb-12d791c82f35-host\") on node \"crc\" DevicePath \"\"" Jan 11 18:32:08 crc kubenswrapper[4837]: I0111 18:32:08.599422 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78be57f3-c282-42ed-98cb-12d791c82f35-kube-api-access-ml59h" (OuterVolumeSpecName: "kube-api-access-ml59h") pod "78be57f3-c282-42ed-98cb-12d791c82f35" (UID: "78be57f3-c282-42ed-98cb-12d791c82f35"). InnerVolumeSpecName "kube-api-access-ml59h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:32:08 crc kubenswrapper[4837]: I0111 18:32:08.679905 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml59h\" (UniqueName: \"kubernetes.io/projected/78be57f3-c282-42ed-98cb-12d791c82f35-kube-api-access-ml59h\") on node \"crc\" DevicePath \"\"" Jan 11 18:32:09 crc kubenswrapper[4837]: I0111 18:32:09.352080 4837 scope.go:117] "RemoveContainer" containerID="2878dff4de5afde64dfbbc2a622d1581e04aabe3f5810a4210ce6c8caf30d6e2" Jan 11 18:32:09 crc kubenswrapper[4837]: I0111 18:32:09.352377 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-r9gp6" Jan 11 18:32:09 crc kubenswrapper[4837]: I0111 18:32:09.443536 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:32:09 crc kubenswrapper[4837]: I0111 18:32:09.443576 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:32:09 crc kubenswrapper[4837]: I0111 18:32:09.443617 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:32:09 crc kubenswrapper[4837]: I0111 18:32:09.444291 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:32:09 crc kubenswrapper[4837]: I0111 18:32:09.444335 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" gracePeriod=600 Jan 11 18:32:09 crc kubenswrapper[4837]: E0111 18:32:09.575797 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:32:10 crc kubenswrapper[4837]: I0111 18:32:10.365930 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" exitCode=0 Jan 11 18:32:10 crc kubenswrapper[4837]: I0111 18:32:10.378327 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78be57f3-c282-42ed-98cb-12d791c82f35" path="/var/lib/kubelet/pods/78be57f3-c282-42ed-98cb-12d791c82f35/volumes" Jan 11 18:32:10 crc kubenswrapper[4837]: I0111 18:32:10.379394 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926"} Jan 11 18:32:10 crc kubenswrapper[4837]: I0111 18:32:10.379462 4837 scope.go:117] "RemoveContainer" containerID="4dac9e020402176affb7612f80710fb29436721f04c0573c72f72cf26ab9f649" Jan 11 18:32:10 crc kubenswrapper[4837]: I0111 18:32:10.380365 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:32:10 crc kubenswrapper[4837]: E0111 18:32:10.380800 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:32:22 crc kubenswrapper[4837]: I0111 18:32:22.364594 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:32:22 crc kubenswrapper[4837]: E0111 18:32:22.365221 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.148379 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c9967c84b-cfjvt_42d725d3-10ec-4492-8598-b505cef336fd/barbican-api/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.263055 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c9967c84b-cfjvt_42d725d3-10ec-4492-8598-b505cef336fd/barbican-api-log/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.307174 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6dbd56445d-4bk5s_bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0/barbican-keystone-listener/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.347198 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6dbd56445d-4bk5s_bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0/barbican-keystone-listener-log/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.479476 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f749597dc-j8n24_59aacef4-5c25-42e6-a96f-5ca46dc94667/barbican-worker/0.log" Jan 11 18:32:23 crc 
kubenswrapper[4837]: I0111 18:32:23.519101 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f749597dc-j8n24_59aacef4-5c25-42e6-a96f-5ca46dc94667/barbican-worker-log/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.659089 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw_bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.751962 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/ceilometer-central-agent/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.841725 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/ceilometer-notification-agent/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.850139 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/proxy-httpd/0.log" Jan 11 18:32:23 crc kubenswrapper[4837]: I0111 18:32:23.903089 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/sg-core/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.053930 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a59977a8-3e8d-4fa9-866e-541d9e0d4bda/cinder-api/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.060707 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a59977a8-3e8d-4fa9-866e-541d9e0d4bda/cinder-api-log/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.203058 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b872cd-f683-45bb-94db-710d997ef648/cinder-scheduler/0.log" Jan 11 18:32:24 crc 
kubenswrapper[4837]: I0111 18:32:24.276090 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b872cd-f683-45bb-94db-710d997ef648/probe/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.298329 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9_fd04d490-42de-47b9-aa6f-bc09ba8dd539/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.485745 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-nffxn_355acc57-d5c4-46fa-8881-61cae424d004/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.507726 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-ch9dq_25f14f43-12d0-4c7d-b823-4c6d3eecd355/init/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.725874 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-ch9dq_25f14f43-12d0-4c7d-b823-4c6d3eecd355/dnsmasq-dns/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.745033 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-ch9dq_25f14f43-12d0-4c7d-b823-4c6d3eecd355/init/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.775129 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm_e6773d83-814c-42bc-8578-5746bb984988/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.901664 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3914d94-6947-4a7c-ac5e-45bfe15ae144/glance-httpd/0.log" Jan 11 18:32:24 crc kubenswrapper[4837]: I0111 18:32:24.920216 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3914d94-6947-4a7c-ac5e-45bfe15ae144/glance-log/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.092510 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_57a88fde-50af-4286-b9c6-8a5300b7f26b/glance-httpd/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.096083 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_57a88fde-50af-4286-b9c6-8a5300b7f26b/glance-log/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.282552 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f65cf99f6-zwzzs_ad90513d-7bd8-4407-af16-8d041440673f/horizon/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.371932 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mknj9_e253716a-cb9e-4a48-aca6-5cbd870ef9d5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.616722 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f65cf99f6-zwzzs_ad90513d-7bd8-4407-af16-8d041440673f/horizon-log/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.636069 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-j8p5n_24037829-f96f-4b2e-93b1-968e19a0edb8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.843141 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29469241-hc276_d2c8127b-3998-456f-bd2c-01f945d7f0b9/keystone-cron/0.log" Jan 11 18:32:25 crc kubenswrapper[4837]: I0111 18:32:25.938930 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-79986b9d84-7gl5k_42661ef7-7007-4cff-b945-85690a07399f/keystone-api/0.log" Jan 11 18:32:26 crc kubenswrapper[4837]: I0111 18:32:26.039021 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e2197654-71c4-403f-98d8-994d0225a199/kube-state-metrics/0.log" Jan 11 18:32:26 crc kubenswrapper[4837]: I0111 18:32:26.083916 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s_2948a5a1-4557-4e6b-82d0-6b8e9d7408b1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:26 crc kubenswrapper[4837]: I0111 18:32:26.459424 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75f9d98d89-rxjpb_556f75eb-e607-44ee-bbde-cc94844a98bd/neutron-api/0.log" Jan 11 18:32:26 crc kubenswrapper[4837]: I0111 18:32:26.516325 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75f9d98d89-rxjpb_556f75eb-e607-44ee-bbde-cc94844a98bd/neutron-httpd/0.log" Jan 11 18:32:26 crc kubenswrapper[4837]: I0111 18:32:26.709619 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx_dafad3b0-31b4-467e-9604-485cb65e91e5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:27 crc kubenswrapper[4837]: I0111 18:32:27.261038 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c6de8f9a-cf35-49c6-8b0c-d75ac48b3691/nova-api-log/0.log" Jan 11 18:32:27 crc kubenswrapper[4837]: I0111 18:32:27.262042 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bb1d44ca-482f-455e-bb8c-7c409c3ad6f8/nova-cell0-conductor-conductor/0.log" Jan 11 18:32:27 crc kubenswrapper[4837]: I0111 18:32:27.402430 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c6de8f9a-cf35-49c6-8b0c-d75ac48b3691/nova-api-api/0.log" Jan 11 
18:32:27 crc kubenswrapper[4837]: I0111 18:32:27.562843 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_88d7f74b-9a47-4152-bec1-11e05030e750/nova-cell1-conductor-conductor/0.log" Jan 11 18:32:27 crc kubenswrapper[4837]: I0111 18:32:27.592761 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d590d80f-b67b-4740-8433-bcab03dca733/nova-cell1-novncproxy-novncproxy/0.log" Jan 11 18:32:27 crc kubenswrapper[4837]: I0111 18:32:27.748712 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fctb7_af0d2223-27cf-46b8-9105-735784f027d5/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:27 crc kubenswrapper[4837]: I0111 18:32:27.894018 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1847dbaa-536d-48f0-ac85-de5ad698e483/nova-metadata-log/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.161161 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4180ef05-e41c-4e74-8e23-41fbda984554/nova-scheduler-scheduler/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.186616 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bafaf023-917f-44a9-807e-b6a0f6a55e77/mysql-bootstrap/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.389699 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bafaf023-917f-44a9-807e-b6a0f6a55e77/mysql-bootstrap/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.412556 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bafaf023-917f-44a9-807e-b6a0f6a55e77/galera/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.553214 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_09134535-27db-4787-89a5-c01f72ffa182/mysql-bootstrap/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.785977 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_09134535-27db-4787-89a5-c01f72ffa182/mysql-bootstrap/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.795706 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_09134535-27db-4787-89a5-c01f72ffa182/galera/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.859795 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1847dbaa-536d-48f0-ac85-de5ad698e483/nova-metadata-metadata/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.953888 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a2380f65-2f68-4a02-95c4-b3fd94ba3adc/openstackclient/0.log" Jan 11 18:32:28 crc kubenswrapper[4837]: I0111 18:32:28.996203 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nfclh_62b32964-26a8-4080-a404-0b40c3122184/openstack-network-exporter/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.192563 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovsdb-server-init/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.345196 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovsdb-server-init/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.382670 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovs-vswitchd/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.392702 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovsdb-server/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.608972 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zfjdc_91f28f51-1965-4fdd-bcb8-c261644249d5/ovn-controller/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.629855 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gqrp6_8b03a0af-96d2-4573-aef4-3010b10d138b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.826623 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_de66fa79-5d8b-48c3-a30a-af21fbdd19b3/openstack-network-exporter/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.853117 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_de66fa79-5d8b-48c3-a30a-af21fbdd19b3/ovn-northd/0.log" Jan 11 18:32:29 crc kubenswrapper[4837]: I0111 18:32:29.999534 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5e69a588-3047-499a-b5cb-000fdcc7762a/openstack-network-exporter/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.003087 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5e69a588-3047-499a-b5cb-000fdcc7762a/ovsdbserver-nb/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.116555 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bc3c0fec-5357-46ca-929a-527f01e1eb3d/openstack-network-exporter/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.193076 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bc3c0fec-5357-46ca-929a-527f01e1eb3d/ovsdbserver-sb/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.419002 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_placement-86bdff5ffb-hdnql_1a6ff225-8495-4008-9719-c85bcb7fa65b/placement-api/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.439133 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-86bdff5ffb-hdnql_1a6ff225-8495-4008-9719-c85bcb7fa65b/placement-log/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.547072 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_33de23cd-829c-449c-a816-d8a54f8ea68f/setup-container/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.644197 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_33de23cd-829c-449c-a816-d8a54f8ea68f/rabbitmq/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.697776 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_33de23cd-829c-449c-a816-d8a54f8ea68f/setup-container/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.829886 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c8ee05fd-6122-4935-a3c0-4b9f71175434/setup-container/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.954560 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c8ee05fd-6122-4935-a3c0-4b9f71175434/setup-container/0.log" Jan 11 18:32:30 crc kubenswrapper[4837]: I0111 18:32:30.958624 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c8ee05fd-6122-4935-a3c0-4b9f71175434/rabbitmq/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.001806 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s_a2775520-8fe3-45e2-aab4-91f962ef86cb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.189235 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-jwxdr_8c4987b1-e485-4665-9094-ed0f9cd0ed7d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.248716 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5_4c81bfed-8b17-4ea6-90f8-794ea9dec0f6/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.389645 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xtrks_2809dbe5-de4c-4d4d-9a2c-85c51f4591cb/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.541975 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jcmv2_bd3dd5e3-2424-41fd-a0cf-ae265214d12f/ssh-known-hosts-edpm-deployment/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.781893 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85f864d5b5-z8rsp_134689b3-4006-4e5e-a051-cf51f6c9cf51/proxy-server/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.818806 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85f864d5b5-z8rsp_134689b3-4006-4e5e-a051-cf51f6c9cf51/proxy-httpd/0.log" Jan 11 18:32:31 crc kubenswrapper[4837]: I0111 18:32:31.827177 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qn5fn_43068ba1-1d19-4822-88fa-e52f8fb21738/swift-ring-rebalance/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.006688 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-reaper/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.014365 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-auditor/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.148512 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-replicator/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.176275 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-server/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.243934 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-auditor/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.264440 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-replicator/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.378961 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-server/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.407970 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-updater/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.468168 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-auditor/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.524786 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-expirer/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.597620 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-server/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.658610 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-replicator/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.708301 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-updater/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.734315 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/rsync/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.861401 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/swift-recon-cron/0.log" Jan 11 18:32:32 crc kubenswrapper[4837]: I0111 18:32:32.928544 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-fg96b_38ba1b37-c033-461b-bf07-7aecd5d1e5a1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:33 crc kubenswrapper[4837]: I0111 18:32:33.080432 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_803594a1-a21b-4a8d-bf22-a2f1786b3822/tempest-tests-tempest-tests-runner/0.log" Jan 11 18:32:33 crc kubenswrapper[4837]: I0111 18:32:33.120224 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_cab1d2b9-90c7-478b-905e-0487cb825e65/test-operator-logs-container/0.log" Jan 11 18:32:33 crc kubenswrapper[4837]: I0111 18:32:33.363854 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:32:33 crc kubenswrapper[4837]: E0111 18:32:33.364737 4837 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:32:33 crc kubenswrapper[4837]: I0111 18:32:33.407711 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9_be22134a-b58f-4a66-bcb2-0545a067b33b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:32:36 crc kubenswrapper[4837]: I0111 18:32:36.442435 4837 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod30805298-6272-40e2-a05a-d7b47d8b0bb4"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod30805298-6272-40e2-a05a-d7b47d8b0bb4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod30805298_6272_40e2_a05a_d7b47d8b0bb4.slice" Jan 11 18:32:36 crc kubenswrapper[4837]: E0111 18:32:36.442993 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod30805298-6272-40e2-a05a-d7b47d8b0bb4] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod30805298-6272-40e2-a05a-d7b47d8b0bb4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod30805298_6272_40e2_a05a_d7b47d8b0bb4.slice" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" podUID="30805298-6272-40e2-a05a-d7b47d8b0bb4" Jan 11 18:32:36 crc kubenswrapper[4837]: I0111 18:32:36.580881 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/crc-debug-qgmkb" Jan 11 18:32:42 crc kubenswrapper[4837]: I0111 18:32:42.702759 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d1aa5cf3-303a-4a5b-8802-fe264fa090d6/memcached/0.log" Jan 11 18:32:45 crc kubenswrapper[4837]: I0111 18:32:45.363879 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:32:45 crc kubenswrapper[4837]: E0111 18:32:45.364523 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:32:57 crc kubenswrapper[4837]: I0111 18:32:57.364753 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:32:57 crc kubenswrapper[4837]: E0111 18:32:57.365611 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.071018 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-bv24m_02e82478-6974-4ae1-b8de-57688876d070/manager/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.163711 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/util/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.310033 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/pull/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.333789 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/pull/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.333832 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/util/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.491626 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/util/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.504747 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/extract/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.544311 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/pull/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.698268 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6fhn7_eb2c9390-f27a-46b0-9249-3e9bdc0c99e3/manager/0.log" Jan 11 18:32:58 crc 
kubenswrapper[4837]: I0111 18:32:58.701638 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-26d4f_63384d88-7d49-4951-8ccd-10871b0b18ad/manager/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.882135 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-cz9h6_61f99042-0859-46d8-9af9-727352a885ee/manager/0.log" Jan 11 18:32:58 crc kubenswrapper[4837]: I0111 18:32:58.917085 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-9ptmm_c4d04eda-5046-43cd-b407-ed14ec61cbd6/manager/0.log" Jan 11 18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.116147 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-75fgx_fc05ccce-2544-4a54-bdf8-ec1b792ac1ba/manager/0.log" Jan 11 18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.303863 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-4hndt_c2312108-ddf5-4939-acc1-727557936791/manager/0.log" Jan 11 18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.396227 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-87jjr_537b7dae-5831-4fa5-afba-a5c7e1229e61/manager/0.log" Jan 11 18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.464658 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-jgrh6_8bdb5237-cb95-4e0c-b52c-85a8a419506b/manager/0.log" Jan 11 18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.601929 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-665ds_b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4/manager/0.log" Jan 11 
18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.715003 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-7k4z8_23a744d5-da8a-4fda-8c27-652e4f18d736/manager/0.log" Jan 11 18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.814100 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-x648q_69296cc2-890b-439c-8151-9b10963bae3f/manager/0.log" Jan 11 18:32:59 crc kubenswrapper[4837]: I0111 18:32:59.971991 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vm5gr_b7039fa0-8e22-4369-abcd-baa005429b7b/manager/0.log" Jan 11 18:33:00 crc kubenswrapper[4837]: I0111 18:33:00.032417 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-2lvvk_b8022f77-44ba-493f-bed8-ad82fa1ca45a/manager/0.log" Jan 11 18:33:00 crc kubenswrapper[4837]: I0111 18:33:00.133655 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq_374c350e-a484-40a8-8563-45eb7f3eafd1/manager/0.log" Jan 11 18:33:00 crc kubenswrapper[4837]: I0111 18:33:00.450366 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-597c79dd4-2dspz_229a8de5-0ba1-4408-b093-28e6e74c143b/operator/0.log" Jan 11 18:33:00 crc kubenswrapper[4837]: I0111 18:33:00.510428 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qqjnc_815a7cf2-a384-4c14-954a-19e05a030e78/registry-server/0.log" Jan 11 18:33:00 crc kubenswrapper[4837]: I0111 18:33:00.815496 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-4txs5_8fe4bbe3-9aed-4232-9036-d53346db80b2/manager/0.log" 
Jan 11 18:33:01 crc kubenswrapper[4837]: I0111 18:33:01.113687 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kwp7z_c76abbe1-c9d2-414f-8c9a-372f8d5e17bc/operator/0.log" Jan 11 18:33:01 crc kubenswrapper[4837]: I0111 18:33:01.121220 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-wf222_ddce549f-ba1d-483d-b50b-4011c826bbff/manager/0.log" Jan 11 18:33:01 crc kubenswrapper[4837]: I0111 18:33:01.421257 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-hspqd_5ea53463-b9a9-4406-b27f-ab1324f4bdcc/manager/0.log" Jan 11 18:33:01 crc kubenswrapper[4837]: I0111 18:33:01.434147 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-tkj2c_0296da23-fe5c-4f47-b26b-6d83da73bf31/manager/0.log" Jan 11 18:33:01 crc kubenswrapper[4837]: I0111 18:33:01.437346 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5569b88c46-6jqzq_3081056a-171f-44ab-a8c4-57a3c40686c4/manager/0.log" Jan 11 18:33:01 crc kubenswrapper[4837]: I0111 18:33:01.631419 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jsqjz_56dd103a-afaf-46fa-9cf3-f85418264d29/manager/0.log" Jan 11 18:33:01 crc kubenswrapper[4837]: I0111 18:33:01.658376 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-shqx6_04067843-8e2d-4a0c-8c68-2e321669b605/manager/0.log" Jan 11 18:33:08 crc kubenswrapper[4837]: I0111 18:33:08.365164 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:33:08 crc kubenswrapper[4837]: E0111 
18:33:08.365867 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:33:19 crc kubenswrapper[4837]: I0111 18:33:19.363525 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:33:19 crc kubenswrapper[4837]: E0111 18:33:19.365325 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:33:20 crc kubenswrapper[4837]: I0111 18:33:20.192242 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bs6rl_75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3/control-plane-machine-set-operator/0.log" Jan 11 18:33:20 crc kubenswrapper[4837]: I0111 18:33:20.323168 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xk9j9_b61e27df-5c38-48b3-b6e9-bca3ce8aa429/kube-rbac-proxy/0.log" Jan 11 18:33:20 crc kubenswrapper[4837]: I0111 18:33:20.367604 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xk9j9_b61e27df-5c38-48b3-b6e9-bca3ce8aa429/machine-api-operator/0.log" Jan 11 18:33:32 crc kubenswrapper[4837]: I0111 18:33:32.958414 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-2f444_469b992e-fb84-479b-8ec6-5c6490e9daf5/cert-manager-controller/0.log" Jan 11 18:33:33 crc kubenswrapper[4837]: I0111 18:33:33.157730 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j7rbn_77732f18-1dd2-475e-9d27-69cf1f66df7d/cert-manager-cainjector/0.log" Jan 11 18:33:33 crc kubenswrapper[4837]: I0111 18:33:33.188072 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ssj7v_5bf7e751-4059-4025-b610-732ec84bda0d/cert-manager-webhook/0.log" Jan 11 18:33:34 crc kubenswrapper[4837]: I0111 18:33:34.364414 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:33:34 crc kubenswrapper[4837]: E0111 18:33:34.364818 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:33:46 crc kubenswrapper[4837]: I0111 18:33:46.545358 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-kkz9f_3b85571d-dea1-437f-bd5c-27d5d421411e/nmstate-console-plugin/0.log" Jan 11 18:33:46 crc kubenswrapper[4837]: I0111 18:33:46.723952 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-68ws8_1c2557e3-14a8-4911-92b0-564bb7b60b06/nmstate-handler/0.log" Jan 11 18:33:46 crc kubenswrapper[4837]: I0111 18:33:46.790087 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-znz8s_c440fad0-c0e0-4553-ad26-b843f81c8863/kube-rbac-proxy/0.log" Jan 11 18:33:46 crc kubenswrapper[4837]: I0111 18:33:46.821465 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-znz8s_c440fad0-c0e0-4553-ad26-b843f81c8863/nmstate-metrics/0.log" Jan 11 18:33:46 crc kubenswrapper[4837]: I0111 18:33:46.937933 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-tx7vk_7340b2fb-4088-4358-977e-020434c7fa2c/nmstate-operator/0.log" Jan 11 18:33:46 crc kubenswrapper[4837]: I0111 18:33:46.998301 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-4tph5_5d9870f8-4c71-4490-8e77-17f1a82e725a/nmstate-webhook/0.log" Jan 11 18:33:48 crc kubenswrapper[4837]: I0111 18:33:48.364449 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:33:48 crc kubenswrapper[4837]: E0111 18:33:48.364770 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:34:00 crc kubenswrapper[4837]: I0111 18:34:00.364224 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:34:00 crc kubenswrapper[4837]: E0111 18:34:00.364983 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:34:14 crc kubenswrapper[4837]: I0111 18:34:14.364697 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:34:14 crc kubenswrapper[4837]: E0111 18:34:14.365916 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:34:17 crc kubenswrapper[4837]: I0111 18:34:17.957403 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5bpfr_da885226-0b14-4626-8d89-7d4505ab29a1/kube-rbac-proxy/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.063393 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5bpfr_da885226-0b14-4626-8d89-7d4505ab29a1/controller/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.349923 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.457209 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.466622 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 
18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.498326 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.512539 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.723049 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.730198 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.786205 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.787572 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.922604 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.949495 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.953508 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:34:18 crc kubenswrapper[4837]: I0111 18:34:18.986367 4837 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/controller/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.128305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/frr-metrics/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.131889 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/kube-rbac-proxy/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.184563 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/kube-rbac-proxy-frr/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.331605 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/reloader/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.415916 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-568c8_aa1bc5b4-0f84-413a-a7fd-d2531bbb8265/frr-k8s-webhook-server/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.703724 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b497646f-46nhj_b3e8b743-e8f1-453a-9f63-44700da2d56a/manager/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.768337 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-55549fb586-lvpmn_28a54c4f-092d-4c3e-b528-9d3651c4f3a9/webhook-server/0.log" Jan 11 18:34:19 crc kubenswrapper[4837]: I0111 18:34:19.867833 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8bfbc_e3e46c8e-1e90-49a8-a3eb-879ccd3c4807/kube-rbac-proxy/0.log" Jan 11 18:34:20 crc kubenswrapper[4837]: I0111 18:34:20.357248 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8bfbc_e3e46c8e-1e90-49a8-a3eb-879ccd3c4807/speaker/0.log" Jan 11 18:34:20 crc kubenswrapper[4837]: I0111 18:34:20.442209 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/frr/0.log" Jan 11 18:34:25 crc kubenswrapper[4837]: I0111 18:34:25.364473 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:34:25 crc kubenswrapper[4837]: E0111 18:34:25.367360 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:34:33 crc kubenswrapper[4837]: I0111 18:34:33.854049 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/util/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.009510 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/pull/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.047311 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/util/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.048979 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/pull/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.263837 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/extract/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.265882 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/util/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.285703 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/pull/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.434927 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/util/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.580026 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/util/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.601088 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/pull/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.601235 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/pull/0.log" Jan 11 
18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.759801 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/util/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.781021 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/extract/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.804784 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/pull/0.log" Jan 11 18:34:34 crc kubenswrapper[4837]: I0111 18:34:34.915489 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-utilities/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.094075 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-utilities/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.094254 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-content/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.149019 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-content/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.300149 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-utilities/0.log" Jan 11 18:34:35 crc 
kubenswrapper[4837]: I0111 18:34:35.328610 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-content/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.529368 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-utilities/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.703809 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-utilities/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.754898 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-content/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.830104 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/registry-server/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.852729 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-content/0.log" Jan 11 18:34:35 crc kubenswrapper[4837]: I0111 18:34:35.950767 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-content/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.022310 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-utilities/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.212457 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xb262_a46aac0a-4b71-4559-9481-499e240587e4/marketplace-operator/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.238650 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-utilities/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.491366 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-content/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.505292 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-utilities/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.506167 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-content/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.519806 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/registry-server/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.644951 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-utilities/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.693839 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-content/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.844715 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/registry-server/0.log" Jan 11 18:34:36 crc kubenswrapper[4837]: I0111 18:34:36.893826 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-utilities/0.log" Jan 11 18:34:37 crc kubenswrapper[4837]: I0111 18:34:37.045984 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-content/0.log" Jan 11 18:34:37 crc kubenswrapper[4837]: I0111 18:34:37.055874 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-utilities/0.log" Jan 11 18:34:37 crc kubenswrapper[4837]: I0111 18:34:37.073845 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-content/0.log" Jan 11 18:34:37 crc kubenswrapper[4837]: I0111 18:34:37.243695 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-content/0.log" Jan 11 18:34:37 crc kubenswrapper[4837]: I0111 18:34:37.250758 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-utilities/0.log" Jan 11 18:34:37 crc kubenswrapper[4837]: I0111 18:34:37.554436 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/registry-server/0.log" Jan 11 18:34:38 crc kubenswrapper[4837]: I0111 18:34:38.364910 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:34:38 crc kubenswrapper[4837]: E0111 18:34:38.365197 
4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:34:50 crc kubenswrapper[4837]: I0111 18:34:50.364318 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:34:50 crc kubenswrapper[4837]: E0111 18:34:50.365316 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:35:01 crc kubenswrapper[4837]: I0111 18:35:01.364643 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:35:01 crc kubenswrapper[4837]: E0111 18:35:01.365429 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:35:04 crc kubenswrapper[4837]: E0111 18:35:04.396593 4837 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.196:53782->38.102.83.196:40331: write tcp 
38.102.83.196:53782->38.102.83.196:40331: write: connection reset by peer Jan 11 18:35:14 crc kubenswrapper[4837]: I0111 18:35:14.365103 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:35:14 crc kubenswrapper[4837]: E0111 18:35:14.366038 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:35:26 crc kubenswrapper[4837]: I0111 18:35:26.371815 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:35:26 crc kubenswrapper[4837]: E0111 18:35:26.372772 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:35:37 crc kubenswrapper[4837]: I0111 18:35:37.364331 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:35:37 crc kubenswrapper[4837]: E0111 18:35:37.365883 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:35:52 crc kubenswrapper[4837]: I0111 18:35:52.364761 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:35:52 crc kubenswrapper[4837]: E0111 18:35:52.367246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:35:54 crc kubenswrapper[4837]: I0111 18:35:54.330039 4837 scope.go:117] "RemoveContainer" containerID="4e2aaa90d78c6c9e5cd8a4743348828d25024dbd559fb74f26f75a8bfb7f2874" Jan 11 18:35:54 crc kubenswrapper[4837]: I0111 18:35:54.351821 4837 scope.go:117] "RemoveContainer" containerID="0cb2b8668501931d1b36e29c6f4861044ad8b3fb2e95db3f0d3a6a85d502d4e9" Jan 11 18:35:54 crc kubenswrapper[4837]: I0111 18:35:54.372252 4837 scope.go:117] "RemoveContainer" containerID="a34866db46716c6a7a02e085f059bc28f691429bc5adfcac003fb72c0817052a" Jan 11 18:36:03 crc kubenswrapper[4837]: I0111 18:36:03.365034 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:36:03 crc kubenswrapper[4837]: E0111 18:36:03.366462 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 
11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.745297 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6gwtz"] Jan 11 18:36:15 crc kubenswrapper[4837]: E0111 18:36:15.748573 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78be57f3-c282-42ed-98cb-12d791c82f35" containerName="container-00" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.749079 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="78be57f3-c282-42ed-98cb-12d791c82f35" containerName="container-00" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.749429 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="78be57f3-c282-42ed-98cb-12d791c82f35" containerName="container-00" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.751086 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.756950 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gwtz"] Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.826838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-catalog-content\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.827900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftgjh\" (UniqueName: \"kubernetes.io/projected/88595eec-23d3-4eb3-b3d4-98c38763fc73-kube-api-access-ftgjh\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc 
kubenswrapper[4837]: I0111 18:36:15.828033 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-utilities\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.929607 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-catalog-content\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.929883 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftgjh\" (UniqueName: \"kubernetes.io/projected/88595eec-23d3-4eb3-b3d4-98c38763fc73-kube-api-access-ftgjh\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.929991 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-utilities\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.930185 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-catalog-content\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc 
kubenswrapper[4837]: I0111 18:36:15.930473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-utilities\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:15 crc kubenswrapper[4837]: I0111 18:36:15.955428 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftgjh\" (UniqueName: \"kubernetes.io/projected/88595eec-23d3-4eb3-b3d4-98c38763fc73-kube-api-access-ftgjh\") pod \"certified-operators-6gwtz\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:16 crc kubenswrapper[4837]: I0111 18:36:16.086998 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:16 crc kubenswrapper[4837]: I0111 18:36:16.565308 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gwtz"] Jan 11 18:36:16 crc kubenswrapper[4837]: W0111 18:36:16.571308 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88595eec_23d3_4eb3_b3d4_98c38763fc73.slice/crio-ff3589f2519aa34fe2ea237e117e45cc4d36c8e54a8c68c6c8f6f24f4f52416b WatchSource:0}: Error finding container ff3589f2519aa34fe2ea237e117e45cc4d36c8e54a8c68c6c8f6f24f4f52416b: Status 404 returned error can't find the container with id ff3589f2519aa34fe2ea237e117e45cc4d36c8e54a8c68c6c8f6f24f4f52416b Jan 11 18:36:17 crc kubenswrapper[4837]: I0111 18:36:17.364066 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:36:17 crc kubenswrapper[4837]: E0111 18:36:17.364510 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:36:17 crc kubenswrapper[4837]: I0111 18:36:17.374614 4837 generic.go:334] "Generic (PLEG): container finished" podID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerID="e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17" exitCode=0 Jan 11 18:36:17 crc kubenswrapper[4837]: I0111 18:36:17.374692 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gwtz" event={"ID":"88595eec-23d3-4eb3-b3d4-98c38763fc73","Type":"ContainerDied","Data":"e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17"} Jan 11 18:36:17 crc kubenswrapper[4837]: I0111 18:36:17.374751 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gwtz" event={"ID":"88595eec-23d3-4eb3-b3d4-98c38763fc73","Type":"ContainerStarted","Data":"ff3589f2519aa34fe2ea237e117e45cc4d36c8e54a8c68c6c8f6f24f4f52416b"} Jan 11 18:36:17 crc kubenswrapper[4837]: I0111 18:36:17.376518 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.146485 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-66dzk"] Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.152018 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.162909 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66dzk"] Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.283075 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-catalog-content\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.283167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-utilities\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.283232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csf5m\" (UniqueName: \"kubernetes.io/projected/094b94b8-2df0-4674-923c-bcedda01f06d-kube-api-access-csf5m\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.385845 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-catalog-content\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.385934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-utilities\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.385999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csf5m\" (UniqueName: \"kubernetes.io/projected/094b94b8-2df0-4674-923c-bcedda01f06d-kube-api-access-csf5m\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.386909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-catalog-content\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.387178 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-utilities\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.403906 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csf5m\" (UniqueName: \"kubernetes.io/projected/094b94b8-2df0-4674-923c-bcedda01f06d-kube-api-access-csf5m\") pod \"redhat-operators-66dzk\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.475909 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:18 crc kubenswrapper[4837]: I0111 18:36:18.923572 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66dzk"] Jan 11 18:36:18 crc kubenswrapper[4837]: W0111 18:36:18.933068 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094b94b8_2df0_4674_923c_bcedda01f06d.slice/crio-d921a63f3324c110eab90aa528703027f8b9b95d525a798945e08b4467b224d8 WatchSource:0}: Error finding container d921a63f3324c110eab90aa528703027f8b9b95d525a798945e08b4467b224d8: Status 404 returned error can't find the container with id d921a63f3324c110eab90aa528703027f8b9b95d525a798945e08b4467b224d8 Jan 11 18:36:19 crc kubenswrapper[4837]: I0111 18:36:19.392171 4837 generic.go:334] "Generic (PLEG): container finished" podID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerID="ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee" exitCode=0 Jan 11 18:36:19 crc kubenswrapper[4837]: I0111 18:36:19.392240 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gwtz" event={"ID":"88595eec-23d3-4eb3-b3d4-98c38763fc73","Type":"ContainerDied","Data":"ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee"} Jan 11 18:36:19 crc kubenswrapper[4837]: I0111 18:36:19.394961 4837 generic.go:334] "Generic (PLEG): container finished" podID="094b94b8-2df0-4674-923c-bcedda01f06d" containerID="a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed" exitCode=0 Jan 11 18:36:19 crc kubenswrapper[4837]: I0111 18:36:19.394996 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66dzk" event={"ID":"094b94b8-2df0-4674-923c-bcedda01f06d","Type":"ContainerDied","Data":"a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed"} Jan 11 18:36:19 crc kubenswrapper[4837]: I0111 18:36:19.395020 
4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66dzk" event={"ID":"094b94b8-2df0-4674-923c-bcedda01f06d","Type":"ContainerStarted","Data":"d921a63f3324c110eab90aa528703027f8b9b95d525a798945e08b4467b224d8"} Jan 11 18:36:21 crc kubenswrapper[4837]: I0111 18:36:21.416036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66dzk" event={"ID":"094b94b8-2df0-4674-923c-bcedda01f06d","Type":"ContainerStarted","Data":"483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433"} Jan 11 18:36:21 crc kubenswrapper[4837]: I0111 18:36:21.419918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gwtz" event={"ID":"88595eec-23d3-4eb3-b3d4-98c38763fc73","Type":"ContainerStarted","Data":"659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85"} Jan 11 18:36:21 crc kubenswrapper[4837]: I0111 18:36:21.473809 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6gwtz" podStartSLOduration=3.7196823610000003 podStartE2EDuration="6.473786055s" podCreationTimestamp="2026-01-11 18:36:15 +0000 UTC" firstStartedPulling="2026-01-11 18:36:17.37631417 +0000 UTC m=+3951.554506876" lastFinishedPulling="2026-01-11 18:36:20.130417864 +0000 UTC m=+3954.308610570" observedRunningTime="2026-01-11 18:36:21.461984457 +0000 UTC m=+3955.640177233" watchObservedRunningTime="2026-01-11 18:36:21.473786055 +0000 UTC m=+3955.651978771" Jan 11 18:36:24 crc kubenswrapper[4837]: I0111 18:36:24.453243 4837 generic.go:334] "Generic (PLEG): container finished" podID="094b94b8-2df0-4674-923c-bcedda01f06d" containerID="483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433" exitCode=0 Jan 11 18:36:24 crc kubenswrapper[4837]: I0111 18:36:24.453501 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66dzk" 
event={"ID":"094b94b8-2df0-4674-923c-bcedda01f06d","Type":"ContainerDied","Data":"483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433"} Jan 11 18:36:25 crc kubenswrapper[4837]: I0111 18:36:25.465384 4837 generic.go:334] "Generic (PLEG): container finished" podID="45253be1-632b-4c25-bb8b-f726177c8991" containerID="d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b" exitCode=0 Jan 11 18:36:25 crc kubenswrapper[4837]: I0111 18:36:25.465451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wkvxh/must-gather-msgnj" event={"ID":"45253be1-632b-4c25-bb8b-f726177c8991","Type":"ContainerDied","Data":"d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b"} Jan 11 18:36:25 crc kubenswrapper[4837]: I0111 18:36:25.466280 4837 scope.go:117] "RemoveContainer" containerID="d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b" Jan 11 18:36:26 crc kubenswrapper[4837]: I0111 18:36:26.088044 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:26 crc kubenswrapper[4837]: I0111 18:36:26.088412 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:26 crc kubenswrapper[4837]: I0111 18:36:26.156094 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:26 crc kubenswrapper[4837]: I0111 18:36:26.478424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66dzk" event={"ID":"094b94b8-2df0-4674-923c-bcedda01f06d","Type":"ContainerStarted","Data":"886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3"} Jan 11 18:36:26 crc kubenswrapper[4837]: I0111 18:36:26.501590 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-wkvxh_must-gather-msgnj_45253be1-632b-4c25-bb8b-f726177c8991/gather/0.log" Jan 11 18:36:26 crc kubenswrapper[4837]: I0111 18:36:26.505102 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-66dzk" podStartSLOduration=2.482719654 podStartE2EDuration="8.505087133s" podCreationTimestamp="2026-01-11 18:36:18 +0000 UTC" firstStartedPulling="2026-01-11 18:36:19.396327278 +0000 UTC m=+3953.574519984" lastFinishedPulling="2026-01-11 18:36:25.418694757 +0000 UTC m=+3959.596887463" observedRunningTime="2026-01-11 18:36:26.502135314 +0000 UTC m=+3960.680328090" watchObservedRunningTime="2026-01-11 18:36:26.505087133 +0000 UTC m=+3960.683279859" Jan 11 18:36:26 crc kubenswrapper[4837]: I0111 18:36:26.557973 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:28 crc kubenswrapper[4837]: I0111 18:36:28.476763 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:28 crc kubenswrapper[4837]: I0111 18:36:28.477058 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:28 crc kubenswrapper[4837]: I0111 18:36:28.542545 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6gwtz"] Jan 11 18:36:28 crc kubenswrapper[4837]: I0111 18:36:28.542937 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6gwtz" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="registry-server" containerID="cri-o://659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85" gracePeriod=2 Jan 11 18:36:28 crc kubenswrapper[4837]: E0111 18:36:28.620463 4837 upgradeaware.go:441] Error proxying data from backend to client: writeto 
tcp 38.102.83.196:57672->38.102.83.196:40331: read tcp 38.102.83.196:57672->38.102.83.196:40331: read: connection reset by peer Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.122911 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.213965 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftgjh\" (UniqueName: \"kubernetes.io/projected/88595eec-23d3-4eb3-b3d4-98c38763fc73-kube-api-access-ftgjh\") pod \"88595eec-23d3-4eb3-b3d4-98c38763fc73\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.214102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-utilities\") pod \"88595eec-23d3-4eb3-b3d4-98c38763fc73\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.214135 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-catalog-content\") pod \"88595eec-23d3-4eb3-b3d4-98c38763fc73\" (UID: \"88595eec-23d3-4eb3-b3d4-98c38763fc73\") " Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.215486 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-utilities" (OuterVolumeSpecName: "utilities") pod "88595eec-23d3-4eb3-b3d4-98c38763fc73" (UID: "88595eec-23d3-4eb3-b3d4-98c38763fc73"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.222578 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88595eec-23d3-4eb3-b3d4-98c38763fc73-kube-api-access-ftgjh" (OuterVolumeSpecName: "kube-api-access-ftgjh") pod "88595eec-23d3-4eb3-b3d4-98c38763fc73" (UID: "88595eec-23d3-4eb3-b3d4-98c38763fc73"). InnerVolumeSpecName "kube-api-access-ftgjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.258420 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88595eec-23d3-4eb3-b3d4-98c38763fc73" (UID: "88595eec-23d3-4eb3-b3d4-98c38763fc73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.315764 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.315803 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88595eec-23d3-4eb3-b3d4-98c38763fc73-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.315817 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftgjh\" (UniqueName: \"kubernetes.io/projected/88595eec-23d3-4eb3-b3d4-98c38763fc73-kube-api-access-ftgjh\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.532318 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-66dzk" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" 
containerName="registry-server" probeResult="failure" output=< Jan 11 18:36:29 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 18:36:29 crc kubenswrapper[4837]: > Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.534411 4837 generic.go:334] "Generic (PLEG): container finished" podID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerID="659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85" exitCode=0 Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.534454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gwtz" event={"ID":"88595eec-23d3-4eb3-b3d4-98c38763fc73","Type":"ContainerDied","Data":"659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85"} Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.534483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gwtz" event={"ID":"88595eec-23d3-4eb3-b3d4-98c38763fc73","Type":"ContainerDied","Data":"ff3589f2519aa34fe2ea237e117e45cc4d36c8e54a8c68c6c8f6f24f4f52416b"} Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.534504 4837 scope.go:117] "RemoveContainer" containerID="659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.534709 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6gwtz" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.576974 4837 scope.go:117] "RemoveContainer" containerID="ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.583053 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6gwtz"] Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.603399 4837 scope.go:117] "RemoveContainer" containerID="e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.606980 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6gwtz"] Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.690923 4837 scope.go:117] "RemoveContainer" containerID="659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85" Jan 11 18:36:29 crc kubenswrapper[4837]: E0111 18:36:29.691311 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85\": container with ID starting with 659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85 not found: ID does not exist" containerID="659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.691366 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85"} err="failed to get container status \"659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85\": rpc error: code = NotFound desc = could not find container \"659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85\": container with ID starting with 659c19ec82617039ce7f7d1496fe4dfefb76b995d2e8157acbb1f00fba1a7e85 not 
found: ID does not exist" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.691399 4837 scope.go:117] "RemoveContainer" containerID="ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee" Jan 11 18:36:29 crc kubenswrapper[4837]: E0111 18:36:29.691752 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee\": container with ID starting with ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee not found: ID does not exist" containerID="ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.691779 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee"} err="failed to get container status \"ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee\": rpc error: code = NotFound desc = could not find container \"ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee\": container with ID starting with ad5a5af7e00bbca96098192ec5670355f6fec7a70c7ef72d6c9c9cb4535b2bee not found: ID does not exist" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.691797 4837 scope.go:117] "RemoveContainer" containerID="e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17" Jan 11 18:36:29 crc kubenswrapper[4837]: E0111 18:36:29.692020 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17\": container with ID starting with e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17 not found: ID does not exist" containerID="e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17" Jan 11 18:36:29 crc kubenswrapper[4837]: I0111 18:36:29.692062 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17"} err="failed to get container status \"e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17\": rpc error: code = NotFound desc = could not find container \"e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17\": container with ID starting with e7a35c14f608068fe7b15813373c2261c847b079174742a60104403a9a06db17 not found: ID does not exist" Jan 11 18:36:30 crc kubenswrapper[4837]: I0111 18:36:30.383651 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" path="/var/lib/kubelet/pods/88595eec-23d3-4eb3-b3d4-98c38763fc73/volumes" Jan 11 18:36:31 crc kubenswrapper[4837]: I0111 18:36:31.364376 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:36:31 crc kubenswrapper[4837]: E0111 18:36:31.364756 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:36:34 crc kubenswrapper[4837]: I0111 18:36:34.891699 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wkvxh/must-gather-msgnj"] Jan 11 18:36:34 crc kubenswrapper[4837]: I0111 18:36:34.892772 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wkvxh/must-gather-msgnj" podUID="45253be1-632b-4c25-bb8b-f726177c8991" containerName="copy" containerID="cri-o://664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32" gracePeriod=2 Jan 11 18:36:34 crc 
kubenswrapper[4837]: I0111 18:36:34.904444 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wkvxh/must-gather-msgnj"] Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.357217 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wkvxh_must-gather-msgnj_45253be1-632b-4c25-bb8b-f726177c8991/copy/0.log" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.357730 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.435044 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45253be1-632b-4c25-bb8b-f726177c8991-must-gather-output\") pod \"45253be1-632b-4c25-bb8b-f726177c8991\" (UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.435121 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcqx\" (UniqueName: \"kubernetes.io/projected/45253be1-632b-4c25-bb8b-f726177c8991-kube-api-access-qmcqx\") pod \"45253be1-632b-4c25-bb8b-f726177c8991\" (UID: \"45253be1-632b-4c25-bb8b-f726177c8991\") " Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.442945 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45253be1-632b-4c25-bb8b-f726177c8991-kube-api-access-qmcqx" (OuterVolumeSpecName: "kube-api-access-qmcqx") pod "45253be1-632b-4c25-bb8b-f726177c8991" (UID: "45253be1-632b-4c25-bb8b-f726177c8991"). InnerVolumeSpecName "kube-api-access-qmcqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.538905 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmcqx\" (UniqueName: \"kubernetes.io/projected/45253be1-632b-4c25-bb8b-f726177c8991-kube-api-access-qmcqx\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.572711 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45253be1-632b-4c25-bb8b-f726177c8991-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "45253be1-632b-4c25-bb8b-f726177c8991" (UID: "45253be1-632b-4c25-bb8b-f726177c8991"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.605538 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wkvxh_must-gather-msgnj_45253be1-632b-4c25-bb8b-f726177c8991/copy/0.log" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.606165 4837 generic.go:334] "Generic (PLEG): container finished" podID="45253be1-632b-4c25-bb8b-f726177c8991" containerID="664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32" exitCode=143 Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.606247 4837 scope.go:117] "RemoveContainer" containerID="664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.606248 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wkvxh/must-gather-msgnj" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.640835 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45253be1-632b-4c25-bb8b-f726177c8991-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.642159 4837 scope.go:117] "RemoveContainer" containerID="d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.737499 4837 scope.go:117] "RemoveContainer" containerID="664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32" Jan 11 18:36:35 crc kubenswrapper[4837]: E0111 18:36:35.738146 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32\": container with ID starting with 664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32 not found: ID does not exist" containerID="664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.738175 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32"} err="failed to get container status \"664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32\": rpc error: code = NotFound desc = could not find container \"664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32\": container with ID starting with 664fd153625177cd8cd2575ba1b483e950198618da90b3b8fb26bdeb13a20b32 not found: ID does not exist" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.738196 4837 scope.go:117] "RemoveContainer" containerID="d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b" Jan 11 18:36:35 crc kubenswrapper[4837]: E0111 
18:36:35.738504 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b\": container with ID starting with d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b not found: ID does not exist" containerID="d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b" Jan 11 18:36:35 crc kubenswrapper[4837]: I0111 18:36:35.738546 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b"} err="failed to get container status \"d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b\": rpc error: code = NotFound desc = could not find container \"d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b\": container with ID starting with d33ec879d3a24fa77b931917a343a1dcc2990c6d060eac4f0d2b25e2c0302d8b not found: ID does not exist" Jan 11 18:36:36 crc kubenswrapper[4837]: I0111 18:36:36.376762 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45253be1-632b-4c25-bb8b-f726177c8991" path="/var/lib/kubelet/pods/45253be1-632b-4c25-bb8b-f726177c8991/volumes" Jan 11 18:36:38 crc kubenswrapper[4837]: I0111 18:36:38.524421 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:38 crc kubenswrapper[4837]: I0111 18:36:38.604542 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:38 crc kubenswrapper[4837]: I0111 18:36:38.771756 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66dzk"] Jan 11 18:36:39 crc kubenswrapper[4837]: I0111 18:36:39.647749 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-66dzk" 
podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="registry-server" containerID="cri-o://886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3" gracePeriod=2 Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.166163 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.230276 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-utilities\") pod \"094b94b8-2df0-4674-923c-bcedda01f06d\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.230371 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-catalog-content\") pod \"094b94b8-2df0-4674-923c-bcedda01f06d\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.230472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csf5m\" (UniqueName: \"kubernetes.io/projected/094b94b8-2df0-4674-923c-bcedda01f06d-kube-api-access-csf5m\") pod \"094b94b8-2df0-4674-923c-bcedda01f06d\" (UID: \"094b94b8-2df0-4674-923c-bcedda01f06d\") " Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.231300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-utilities" (OuterVolumeSpecName: "utilities") pod "094b94b8-2df0-4674-923c-bcedda01f06d" (UID: "094b94b8-2df0-4674-923c-bcedda01f06d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.239848 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/094b94b8-2df0-4674-923c-bcedda01f06d-kube-api-access-csf5m" (OuterVolumeSpecName: "kube-api-access-csf5m") pod "094b94b8-2df0-4674-923c-bcedda01f06d" (UID: "094b94b8-2df0-4674-923c-bcedda01f06d"). InnerVolumeSpecName "kube-api-access-csf5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.332632 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.332693 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csf5m\" (UniqueName: \"kubernetes.io/projected/094b94b8-2df0-4674-923c-bcedda01f06d-kube-api-access-csf5m\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.350915 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "094b94b8-2df0-4674-923c-bcedda01f06d" (UID: "094b94b8-2df0-4674-923c-bcedda01f06d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.434772 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094b94b8-2df0-4674-923c-bcedda01f06d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.657527 4837 generic.go:334] "Generic (PLEG): container finished" podID="094b94b8-2df0-4674-923c-bcedda01f06d" containerID="886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3" exitCode=0 Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.657571 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66dzk" event={"ID":"094b94b8-2df0-4674-923c-bcedda01f06d","Type":"ContainerDied","Data":"886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3"} Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.657612 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-66dzk" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.657627 4837 scope.go:117] "RemoveContainer" containerID="886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.657614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66dzk" event={"ID":"094b94b8-2df0-4674-923c-bcedda01f06d","Type":"ContainerDied","Data":"d921a63f3324c110eab90aa528703027f8b9b95d525a798945e08b4467b224d8"} Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.681034 4837 scope.go:117] "RemoveContainer" containerID="483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.687803 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66dzk"] Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.700666 4837 scope.go:117] "RemoveContainer" containerID="a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.702107 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-66dzk"] Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.762306 4837 scope.go:117] "RemoveContainer" containerID="886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3" Jan 11 18:36:40 crc kubenswrapper[4837]: E0111 18:36:40.762830 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3\": container with ID starting with 886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3 not found: ID does not exist" containerID="886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.762865 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3"} err="failed to get container status \"886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3\": rpc error: code = NotFound desc = could not find container \"886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3\": container with ID starting with 886637c387b14ab3f797348b5fafd68123afe8170ba144b2f27efd43aa4ef6e3 not found: ID does not exist" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.762885 4837 scope.go:117] "RemoveContainer" containerID="483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433" Jan 11 18:36:40 crc kubenswrapper[4837]: E0111 18:36:40.763281 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433\": container with ID starting with 483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433 not found: ID does not exist" containerID="483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.763311 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433"} err="failed to get container status \"483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433\": rpc error: code = NotFound desc = could not find container \"483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433\": container with ID starting with 483f9b0baaf73a940066dda3aed196d256618be93c03590f64d3a5cfaff1d433 not found: ID does not exist" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.763328 4837 scope.go:117] "RemoveContainer" containerID="a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed" Jan 11 18:36:40 crc kubenswrapper[4837]: E0111 
18:36:40.764562 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed\": container with ID starting with a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed not found: ID does not exist" containerID="a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed" Jan 11 18:36:40 crc kubenswrapper[4837]: I0111 18:36:40.764590 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed"} err="failed to get container status \"a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed\": rpc error: code = NotFound desc = could not find container \"a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed\": container with ID starting with a2273bd94c532993e42867192534817429c0d0f816182efec4f0fed820e6a8ed not found: ID does not exist" Jan 11 18:36:42 crc kubenswrapper[4837]: I0111 18:36:42.364402 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:36:42 crc kubenswrapper[4837]: E0111 18:36:42.364795 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:36:42 crc kubenswrapper[4837]: I0111 18:36:42.378780 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" path="/var/lib/kubelet/pods/094b94b8-2df0-4674-923c-bcedda01f06d/volumes" Jan 11 18:36:57 crc kubenswrapper[4837]: I0111 18:36:57.365213 
4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:36:57 crc kubenswrapper[4837]: E0111 18:36:57.367312 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:37:11 crc kubenswrapper[4837]: I0111 18:37:11.364072 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:37:11 crc kubenswrapper[4837]: I0111 18:37:11.968082 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"c05585a8023b2af68a7f80685d84b74e2cdf192f7d872b26f5d474040ef515c9"} Jan 11 18:37:54 crc kubenswrapper[4837]: I0111 18:37:54.518529 4837 scope.go:117] "RemoveContainer" containerID="2f0b2b94cc7a5404adef5c0c69a2d2ffc79086e54a4eb7b484e6ca21b6880702" Jan 11 18:38:54 crc kubenswrapper[4837]: I0111 18:38:54.593593 4837 scope.go:117] "RemoveContainer" containerID="3ac8b730d0924b65be9170e112cd4c19b891c8bed12714d223023abfbb6cf1f2" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.028273 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bfzgr/must-gather-8s65m"] Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029150 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="registry-server" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029162 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="registry-server" Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029176 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="extract-utilities" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029183 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="extract-utilities" Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029198 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="registry-server" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029204 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="registry-server" Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029216 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="extract-content" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029221 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="extract-content" Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029235 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="extract-utilities" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029242 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="extract-utilities" Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029252 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45253be1-632b-4c25-bb8b-f726177c8991" containerName="copy" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029257 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45253be1-632b-4c25-bb8b-f726177c8991" containerName="copy" Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029263 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45253be1-632b-4c25-bb8b-f726177c8991" containerName="gather" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029268 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="45253be1-632b-4c25-bb8b-f726177c8991" containerName="gather" Jan 11 18:39:36 crc kubenswrapper[4837]: E0111 18:39:36.029280 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="extract-content" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029286 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="extract-content" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029458 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="88595eec-23d3-4eb3-b3d4-98c38763fc73" containerName="registry-server" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029469 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="094b94b8-2df0-4674-923c-bcedda01f06d" containerName="registry-server" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029475 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="45253be1-632b-4c25-bb8b-f726177c8991" containerName="copy" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.029483 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="45253be1-632b-4c25-bb8b-f726177c8991" containerName="gather" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.030471 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.039196 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bfzgr"/"kube-root-ca.crt" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.039267 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bfzgr"/"openshift-service-ca.crt" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.074506 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjxmw\" (UniqueName: \"kubernetes.io/projected/2001d3ff-95d2-472f-9116-306e63afac42-kube-api-access-wjxmw\") pod \"must-gather-8s65m\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.074558 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2001d3ff-95d2-472f-9116-306e63afac42-must-gather-output\") pod \"must-gather-8s65m\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.075943 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bfzgr/must-gather-8s65m"] Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.175472 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjxmw\" (UniqueName: \"kubernetes.io/projected/2001d3ff-95d2-472f-9116-306e63afac42-kube-api-access-wjxmw\") pod \"must-gather-8s65m\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.175518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2001d3ff-95d2-472f-9116-306e63afac42-must-gather-output\") pod \"must-gather-8s65m\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.176113 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2001d3ff-95d2-472f-9116-306e63afac42-must-gather-output\") pod \"must-gather-8s65m\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.197578 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjxmw\" (UniqueName: \"kubernetes.io/projected/2001d3ff-95d2-472f-9116-306e63afac42-kube-api-access-wjxmw\") pod \"must-gather-8s65m\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:36 crc kubenswrapper[4837]: I0111 18:39:36.358637 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:39:37 crc kubenswrapper[4837]: I0111 18:39:37.241750 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bfzgr/must-gather-8s65m"] Jan 11 18:39:37 crc kubenswrapper[4837]: I0111 18:39:37.591288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/must-gather-8s65m" event={"ID":"2001d3ff-95d2-472f-9116-306e63afac42","Type":"ContainerStarted","Data":"0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802"} Jan 11 18:39:37 crc kubenswrapper[4837]: I0111 18:39:37.591342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/must-gather-8s65m" event={"ID":"2001d3ff-95d2-472f-9116-306e63afac42","Type":"ContainerStarted","Data":"8aca70c6cd5e1078e111854ebc446f8cb27dab0c3d68f8f40c9ea83c86466f70"} Jan 11 18:39:38 crc kubenswrapper[4837]: I0111 18:39:38.607996 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/must-gather-8s65m" event={"ID":"2001d3ff-95d2-472f-9116-306e63afac42","Type":"ContainerStarted","Data":"04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b"} Jan 11 18:39:38 crc kubenswrapper[4837]: I0111 18:39:38.659462 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bfzgr/must-gather-8s65m" podStartSLOduration=3.659438316 podStartE2EDuration="3.659438316s" podCreationTimestamp="2026-01-11 18:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 18:39:38.627844717 +0000 UTC m=+4152.806037433" watchObservedRunningTime="2026-01-11 18:39:38.659438316 +0000 UTC m=+4152.837631022" Jan 11 18:39:39 crc kubenswrapper[4837]: I0111 18:39:39.443909 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:39:39 crc kubenswrapper[4837]: I0111 18:39:39.444332 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.099443 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-9s625"] Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.101238 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.103123 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bfzgr"/"default-dockercfg-fh9jq" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.169248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkqp\" (UniqueName: \"kubernetes.io/projected/352bf2e0-daf1-4394-a67b-64048295f411-kube-api-access-4gkqp\") pod \"crc-debug-9s625\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.169576 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/352bf2e0-daf1-4394-a67b-64048295f411-host\") pod \"crc-debug-9s625\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.270855 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/352bf2e0-daf1-4394-a67b-64048295f411-host\") pod \"crc-debug-9s625\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.270964 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkqp\" (UniqueName: \"kubernetes.io/projected/352bf2e0-daf1-4394-a67b-64048295f411-kube-api-access-4gkqp\") pod \"crc-debug-9s625\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.271000 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/352bf2e0-daf1-4394-a67b-64048295f411-host\") pod \"crc-debug-9s625\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.306480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkqp\" (UniqueName: \"kubernetes.io/projected/352bf2e0-daf1-4394-a67b-64048295f411-kube-api-access-4gkqp\") pod \"crc-debug-9s625\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.418838 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:39:41 crc kubenswrapper[4837]: W0111 18:39:41.458311 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod352bf2e0_daf1_4394_a67b_64048295f411.slice/crio-df13cb00828983991aa3a5f7a618fcf2cc4a5cdce9256568371d93e1b5096915 WatchSource:0}: Error finding container df13cb00828983991aa3a5f7a618fcf2cc4a5cdce9256568371d93e1b5096915: Status 404 returned error can't find the container with id df13cb00828983991aa3a5f7a618fcf2cc4a5cdce9256568371d93e1b5096915 Jan 11 18:39:41 crc kubenswrapper[4837]: I0111 18:39:41.635934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/crc-debug-9s625" event={"ID":"352bf2e0-daf1-4394-a67b-64048295f411","Type":"ContainerStarted","Data":"df13cb00828983991aa3a5f7a618fcf2cc4a5cdce9256568371d93e1b5096915"} Jan 11 18:39:42 crc kubenswrapper[4837]: I0111 18:39:42.646881 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/crc-debug-9s625" event={"ID":"352bf2e0-daf1-4394-a67b-64048295f411","Type":"ContainerStarted","Data":"8a116824bb73a003079673012a5705754413a97724f263dc40ad9152058d2fa5"} Jan 11 18:39:42 crc kubenswrapper[4837]: I0111 18:39:42.668353 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bfzgr/crc-debug-9s625" podStartSLOduration=1.668302865 podStartE2EDuration="1.668302865s" podCreationTimestamp="2026-01-11 18:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-11 18:39:42.659784497 +0000 UTC m=+4156.837977203" watchObservedRunningTime="2026-01-11 18:39:42.668302865 +0000 UTC m=+4156.846495581" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.615824 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-mgxzc"] Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.619768 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.639070 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgxzc"] Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.745425 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-catalog-content\") pod \"community-operators-mgxzc\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.745721 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-utilities\") pod \"community-operators-mgxzc\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.745763 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fppb\" (UniqueName: \"kubernetes.io/projected/834208ce-015b-4a5b-aca3-f7aacb41d881-kube-api-access-9fppb\") pod \"community-operators-mgxzc\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.847945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-catalog-content\") pod \"community-operators-mgxzc\" (UID: 
\"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.848049 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-utilities\") pod \"community-operators-mgxzc\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.848068 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fppb\" (UniqueName: \"kubernetes.io/projected/834208ce-015b-4a5b-aca3-f7aacb41d881-kube-api-access-9fppb\") pod \"community-operators-mgxzc\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.848451 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-catalog-content\") pod \"community-operators-mgxzc\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.848500 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-utilities\") pod \"community-operators-mgxzc\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.867049 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fppb\" (UniqueName: \"kubernetes.io/projected/834208ce-015b-4a5b-aca3-f7aacb41d881-kube-api-access-9fppb\") pod \"community-operators-mgxzc\" (UID: 
\"834208ce-015b-4a5b-aca3-f7aacb41d881\") " pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:00 crc kubenswrapper[4837]: I0111 18:40:00.957408 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:01 crc kubenswrapper[4837]: I0111 18:40:01.518481 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgxzc"] Jan 11 18:40:01 crc kubenswrapper[4837]: I0111 18:40:01.873272 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgxzc" event={"ID":"834208ce-015b-4a5b-aca3-f7aacb41d881","Type":"ContainerStarted","Data":"91f0cc6d4930a27417c809dc04b99c87d9bf5aeac12c408874d6c7fdf977941b"} Jan 11 18:40:02 crc kubenswrapper[4837]: I0111 18:40:02.883111 4837 generic.go:334] "Generic (PLEG): container finished" podID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerID="f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732" exitCode=0 Jan 11 18:40:02 crc kubenswrapper[4837]: I0111 18:40:02.883291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgxzc" event={"ID":"834208ce-015b-4a5b-aca3-f7aacb41d881","Type":"ContainerDied","Data":"f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732"} Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.599762 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-btlg4"] Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.605126 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.618578 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btlg4"] Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.699037 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-utilities\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.699150 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-catalog-content\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.699718 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7dqr\" (UniqueName: \"kubernetes.io/projected/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-kube-api-access-h7dqr\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.801407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7dqr\" (UniqueName: \"kubernetes.io/projected/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-kube-api-access-h7dqr\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.801813 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-utilities\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.801897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-catalog-content\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.802330 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-catalog-content\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.802439 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-utilities\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.826647 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7dqr\" (UniqueName: \"kubernetes.io/projected/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-kube-api-access-h7dqr\") pod \"redhat-marketplace-btlg4\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:03 crc kubenswrapper[4837]: I0111 18:40:03.922552 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:04 crc kubenswrapper[4837]: I0111 18:40:04.399089 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btlg4"] Jan 11 18:40:04 crc kubenswrapper[4837]: I0111 18:40:04.902753 4837 generic.go:334] "Generic (PLEG): container finished" podID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerID="d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54" exitCode=0 Jan 11 18:40:04 crc kubenswrapper[4837]: I0111 18:40:04.902885 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgxzc" event={"ID":"834208ce-015b-4a5b-aca3-f7aacb41d881","Type":"ContainerDied","Data":"d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54"} Jan 11 18:40:04 crc kubenswrapper[4837]: I0111 18:40:04.905015 4837 generic.go:334] "Generic (PLEG): container finished" podID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerID="61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58" exitCode=0 Jan 11 18:40:04 crc kubenswrapper[4837]: I0111 18:40:04.905044 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btlg4" event={"ID":"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6","Type":"ContainerDied","Data":"61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58"} Jan 11 18:40:04 crc kubenswrapper[4837]: I0111 18:40:04.905063 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btlg4" event={"ID":"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6","Type":"ContainerStarted","Data":"0565f52b368302310e1b0f324c0752a05f47675405c54ac8c7881ebe0873b608"} Jan 11 18:40:07 crc kubenswrapper[4837]: I0111 18:40:07.385797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgxzc" 
event={"ID":"834208ce-015b-4a5b-aca3-f7aacb41d881","Type":"ContainerStarted","Data":"58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43"} Jan 11 18:40:07 crc kubenswrapper[4837]: I0111 18:40:07.387860 4837 generic.go:334] "Generic (PLEG): container finished" podID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerID="fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2" exitCode=0 Jan 11 18:40:07 crc kubenswrapper[4837]: I0111 18:40:07.387901 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btlg4" event={"ID":"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6","Type":"ContainerDied","Data":"fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2"} Jan 11 18:40:07 crc kubenswrapper[4837]: I0111 18:40:07.426986 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mgxzc" podStartSLOduration=3.918554244 podStartE2EDuration="7.426967879s" podCreationTimestamp="2026-01-11 18:40:00 +0000 UTC" firstStartedPulling="2026-01-11 18:40:02.88921357 +0000 UTC m=+4177.067406276" lastFinishedPulling="2026-01-11 18:40:06.397627165 +0000 UTC m=+4180.575819911" observedRunningTime="2026-01-11 18:40:07.418922402 +0000 UTC m=+4181.597115118" watchObservedRunningTime="2026-01-11 18:40:07.426967879 +0000 UTC m=+4181.605160585" Jan 11 18:40:08 crc kubenswrapper[4837]: I0111 18:40:08.397309 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btlg4" event={"ID":"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6","Type":"ContainerStarted","Data":"158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50"} Jan 11 18:40:08 crc kubenswrapper[4837]: I0111 18:40:08.430895 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-btlg4" podStartSLOduration=2.5670777830000002 podStartE2EDuration="5.43087722s" podCreationTimestamp="2026-01-11 18:40:03 +0000 UTC" 
firstStartedPulling="2026-01-11 18:40:04.906518176 +0000 UTC m=+4179.084710882" lastFinishedPulling="2026-01-11 18:40:07.770317613 +0000 UTC m=+4181.948510319" observedRunningTime="2026-01-11 18:40:08.423166673 +0000 UTC m=+4182.601359379" watchObservedRunningTime="2026-01-11 18:40:08.43087722 +0000 UTC m=+4182.609069926" Jan 11 18:40:09 crc kubenswrapper[4837]: I0111 18:40:09.444157 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:40:09 crc kubenswrapper[4837]: I0111 18:40:09.444223 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:40:10 crc kubenswrapper[4837]: I0111 18:40:10.957899 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:10 crc kubenswrapper[4837]: I0111 18:40:10.958233 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:11 crc kubenswrapper[4837]: I0111 18:40:11.043207 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:11 crc kubenswrapper[4837]: I0111 18:40:11.498706 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:12 crc kubenswrapper[4837]: I0111 18:40:12.186936 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-mgxzc"] Jan 11 18:40:13 crc kubenswrapper[4837]: I0111 18:40:13.444792 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mgxzc" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="registry-server" containerID="cri-o://58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43" gracePeriod=2 Jan 11 18:40:13 crc kubenswrapper[4837]: I0111 18:40:13.923025 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:13 crc kubenswrapper[4837]: I0111 18:40:13.925357 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:13 crc kubenswrapper[4837]: I0111 18:40:13.927617 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:13 crc kubenswrapper[4837]: I0111 18:40:13.976473 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.091760 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fppb\" (UniqueName: \"kubernetes.io/projected/834208ce-015b-4a5b-aca3-f7aacb41d881-kube-api-access-9fppb\") pod \"834208ce-015b-4a5b-aca3-f7aacb41d881\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.092118 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-utilities\") pod \"834208ce-015b-4a5b-aca3-f7aacb41d881\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.092278 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-catalog-content\") pod \"834208ce-015b-4a5b-aca3-f7aacb41d881\" (UID: \"834208ce-015b-4a5b-aca3-f7aacb41d881\") " Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.093458 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-utilities" (OuterVolumeSpecName: "utilities") pod "834208ce-015b-4a5b-aca3-f7aacb41d881" (UID: "834208ce-015b-4a5b-aca3-f7aacb41d881"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.104861 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834208ce-015b-4a5b-aca3-f7aacb41d881-kube-api-access-9fppb" (OuterVolumeSpecName: "kube-api-access-9fppb") pod "834208ce-015b-4a5b-aca3-f7aacb41d881" (UID: "834208ce-015b-4a5b-aca3-f7aacb41d881"). InnerVolumeSpecName "kube-api-access-9fppb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.160139 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "834208ce-015b-4a5b-aca3-f7aacb41d881" (UID: "834208ce-015b-4a5b-aca3-f7aacb41d881"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.193692 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.193724 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/834208ce-015b-4a5b-aca3-f7aacb41d881-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.193739 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fppb\" (UniqueName: \"kubernetes.io/projected/834208ce-015b-4a5b-aca3-f7aacb41d881-kube-api-access-9fppb\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.453373 4837 generic.go:334] "Generic (PLEG): container finished" podID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerID="58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43" exitCode=0 Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.453424 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mgxzc" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.453465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgxzc" event={"ID":"834208ce-015b-4a5b-aca3-f7aacb41d881","Type":"ContainerDied","Data":"58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43"} Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.454407 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgxzc" event={"ID":"834208ce-015b-4a5b-aca3-f7aacb41d881","Type":"ContainerDied","Data":"91f0cc6d4930a27417c809dc04b99c87d9bf5aeac12c408874d6c7fdf977941b"} Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.454483 4837 scope.go:117] "RemoveContainer" containerID="58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.478881 4837 scope.go:117] "RemoveContainer" containerID="d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.499289 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mgxzc"] Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.507460 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mgxzc"] Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.512149 4837 scope.go:117] "RemoveContainer" containerID="f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.523274 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.578915 4837 scope.go:117] "RemoveContainer" containerID="58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43" Jan 11 18:40:14 crc 
kubenswrapper[4837]: E0111 18:40:14.585084 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43\": container with ID starting with 58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43 not found: ID does not exist" containerID="58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.585129 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43"} err="failed to get container status \"58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43\": rpc error: code = NotFound desc = could not find container \"58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43\": container with ID starting with 58a83e4296ef640051e17ff1eaf0e3fcadc3e32dcd64b96f99e346e42ab80e43 not found: ID does not exist" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.585154 4837 scope.go:117] "RemoveContainer" containerID="d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54" Jan 11 18:40:14 crc kubenswrapper[4837]: E0111 18:40:14.585576 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54\": container with ID starting with d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54 not found: ID does not exist" containerID="d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.585613 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54"} err="failed to get container status 
\"d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54\": rpc error: code = NotFound desc = could not find container \"d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54\": container with ID starting with d6202489535355b59807581b8d25858eb7683f0dcd97559b6fb8ded3cf3dfb54 not found: ID does not exist" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.585640 4837 scope.go:117] "RemoveContainer" containerID="f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732" Jan 11 18:40:14 crc kubenswrapper[4837]: E0111 18:40:14.585867 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732\": container with ID starting with f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732 not found: ID does not exist" containerID="f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732" Jan 11 18:40:14 crc kubenswrapper[4837]: I0111 18:40:14.585888 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732"} err="failed to get container status \"f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732\": rpc error: code = NotFound desc = could not find container \"f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732\": container with ID starting with f2d110d975e6cd950cc2fb8c91e39053085a48007291460081167ce75028e732 not found: ID does not exist" Jan 11 18:40:15 crc kubenswrapper[4837]: I0111 18:40:15.474308 4837 generic.go:334] "Generic (PLEG): container finished" podID="352bf2e0-daf1-4394-a67b-64048295f411" containerID="8a116824bb73a003079673012a5705754413a97724f263dc40ad9152058d2fa5" exitCode=0 Jan 11 18:40:15 crc kubenswrapper[4837]: I0111 18:40:15.474493 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/crc-debug-9s625" 
event={"ID":"352bf2e0-daf1-4394-a67b-64048295f411","Type":"ContainerDied","Data":"8a116824bb73a003079673012a5705754413a97724f263dc40ad9152058d2fa5"} Jan 11 18:40:16 crc kubenswrapper[4837]: I0111 18:40:16.378415 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" path="/var/lib/kubelet/pods/834208ce-015b-4a5b-aca3-f7aacb41d881/volumes" Jan 11 18:40:16 crc kubenswrapper[4837]: I0111 18:40:16.385884 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btlg4"] Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.291728 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.327090 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-9s625"] Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.338579 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-9s625"] Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.464517 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gkqp\" (UniqueName: \"kubernetes.io/projected/352bf2e0-daf1-4394-a67b-64048295f411-kube-api-access-4gkqp\") pod \"352bf2e0-daf1-4394-a67b-64048295f411\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.464766 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/352bf2e0-daf1-4394-a67b-64048295f411-host\") pod \"352bf2e0-daf1-4394-a67b-64048295f411\" (UID: \"352bf2e0-daf1-4394-a67b-64048295f411\") " Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.464899 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/352bf2e0-daf1-4394-a67b-64048295f411-host" (OuterVolumeSpecName: "host") pod "352bf2e0-daf1-4394-a67b-64048295f411" (UID: "352bf2e0-daf1-4394-a67b-64048295f411"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.465206 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/352bf2e0-daf1-4394-a67b-64048295f411-host\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.470563 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352bf2e0-daf1-4394-a67b-64048295f411-kube-api-access-4gkqp" (OuterVolumeSpecName: "kube-api-access-4gkqp") pod "352bf2e0-daf1-4394-a67b-64048295f411" (UID: "352bf2e0-daf1-4394-a67b-64048295f411"). InnerVolumeSpecName "kube-api-access-4gkqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.497404 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-btlg4" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="registry-server" containerID="cri-o://158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50" gracePeriod=2 Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.497797 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-9s625" Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.500863 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df13cb00828983991aa3a5f7a618fcf2cc4a5cdce9256568371d93e1b5096915" Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.566339 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gkqp\" (UniqueName: \"kubernetes.io/projected/352bf2e0-daf1-4394-a67b-64048295f411-kube-api-access-4gkqp\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:17 crc kubenswrapper[4837]: I0111 18:40:17.946225 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.075384 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-utilities\") pod \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.075756 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7dqr\" (UniqueName: \"kubernetes.io/projected/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-kube-api-access-h7dqr\") pod \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.075832 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-catalog-content\") pod \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\" (UID: \"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6\") " Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.076546 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-utilities" (OuterVolumeSpecName: "utilities") pod "c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" (UID: "c50e3212-84bd-49a1-a90e-4fdd6f68d3d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.081438 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-kube-api-access-h7dqr" (OuterVolumeSpecName: "kube-api-access-h7dqr") pod "c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" (UID: "c50e3212-84bd-49a1-a90e-4fdd6f68d3d6"). InnerVolumeSpecName "kube-api-access-h7dqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.103912 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" (UID: "c50e3212-84bd-49a1-a90e-4fdd6f68d3d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.177798 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7dqr\" (UniqueName: \"kubernetes.io/projected/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-kube-api-access-h7dqr\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.177835 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.177843 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.373572 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352bf2e0-daf1-4394-a67b-64048295f411" path="/var/lib/kubelet/pods/352bf2e0-daf1-4394-a67b-64048295f411/volumes" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.509225 4837 generic.go:334] "Generic (PLEG): container finished" podID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerID="158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50" exitCode=0 Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.509297 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btlg4" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.509281 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btlg4" event={"ID":"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6","Type":"ContainerDied","Data":"158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50"} Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.509465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btlg4" event={"ID":"c50e3212-84bd-49a1-a90e-4fdd6f68d3d6","Type":"ContainerDied","Data":"0565f52b368302310e1b0f324c0752a05f47675405c54ac8c7881ebe0873b608"} Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.509501 4837 scope.go:117] "RemoveContainer" containerID="158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.526932 4837 scope.go:117] "RemoveContainer" containerID="fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.537365 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btlg4"] Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.545869 4837 scope.go:117] "RemoveContainer" containerID="61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.554790 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-btlg4"] Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.645192 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-784t5"] Jan 11 18:40:18 crc kubenswrapper[4837]: E0111 18:40:18.645591 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="registry-server" Jan 11 18:40:18 crc 
kubenswrapper[4837]: I0111 18:40:18.645611 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="registry-server" Jan 11 18:40:18 crc kubenswrapper[4837]: E0111 18:40:18.645628 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="extract-utilities" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.645635 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="extract-utilities" Jan 11 18:40:18 crc kubenswrapper[4837]: E0111 18:40:18.645655 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="extract-content" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.645661 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="extract-content" Jan 11 18:40:18 crc kubenswrapper[4837]: E0111 18:40:18.645692 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="extract-utilities" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.645698 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="extract-utilities" Jan 11 18:40:18 crc kubenswrapper[4837]: E0111 18:40:18.645706 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="registry-server" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.645711 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="registry-server" Jan 11 18:40:18 crc kubenswrapper[4837]: E0111 18:40:18.645755 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="extract-content" Jan 11 18:40:18 crc 
kubenswrapper[4837]: I0111 18:40:18.645765 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="extract-content" Jan 11 18:40:18 crc kubenswrapper[4837]: E0111 18:40:18.645780 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352bf2e0-daf1-4394-a67b-64048295f411" containerName="container-00" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.645786 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="352bf2e0-daf1-4394-a67b-64048295f411" containerName="container-00" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.646019 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="352bf2e0-daf1-4394-a67b-64048295f411" containerName="container-00" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.646276 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" containerName="registry-server" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.646333 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="834208ce-015b-4a5b-aca3-f7aacb41d881" containerName="registry-server" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.647404 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.649565 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bfzgr"/"default-dockercfg-fh9jq" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.788033 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-host\") pod \"crc-debug-784t5\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.788301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f95hx\" (UniqueName: \"kubernetes.io/projected/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-kube-api-access-f95hx\") pod \"crc-debug-784t5\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.889568 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-host\") pod \"crc-debug-784t5\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.889615 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f95hx\" (UniqueName: \"kubernetes.io/projected/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-kube-api-access-f95hx\") pod \"crc-debug-784t5\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:18 crc kubenswrapper[4837]: I0111 18:40:18.889708 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-host\") pod \"crc-debug-784t5\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.075614 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f95hx\" (UniqueName: \"kubernetes.io/projected/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-kube-api-access-f95hx\") pod \"crc-debug-784t5\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.085419 4837 scope.go:117] "RemoveContainer" containerID="158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50" Jan 11 18:40:19 crc kubenswrapper[4837]: E0111 18:40:19.085980 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50\": container with ID starting with 158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50 not found: ID does not exist" containerID="158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.086036 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50"} err="failed to get container status \"158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50\": rpc error: code = NotFound desc = could not find container \"158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50\": container with ID starting with 158e0954ff38ec74aa9a272ccfc68d31db53546041e153189131968b7bdd3c50 not found: ID does not exist" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.086064 4837 scope.go:117] "RemoveContainer" containerID="fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2" 
Jan 11 18:40:19 crc kubenswrapper[4837]: E0111 18:40:19.086615 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2\": container with ID starting with fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2 not found: ID does not exist" containerID="fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.086647 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2"} err="failed to get container status \"fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2\": rpc error: code = NotFound desc = could not find container \"fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2\": container with ID starting with fb1264d38cf32e197446c5f7cf28ebc9e89e45ad0a8c3a353a8e561ee128d3f2 not found: ID does not exist" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.086666 4837 scope.go:117] "RemoveContainer" containerID="61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58" Jan 11 18:40:19 crc kubenswrapper[4837]: E0111 18:40:19.086916 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58\": container with ID starting with 61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58 not found: ID does not exist" containerID="61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.086938 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58"} err="failed to get container status 
\"61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58\": rpc error: code = NotFound desc = could not find container \"61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58\": container with ID starting with 61fcb90f08e8c6f3abea1835ae16368cf160a44e17065f346f13b97f40812a58 not found: ID does not exist" Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.278255 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:19 crc kubenswrapper[4837]: W0111 18:40:19.315938 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2e68a56_5e3c_4d6b_8091_8f0bee0fe914.slice/crio-c4cbafcaba471c38d8ccfca66baab43caed29b2ee1d8962a5ecfadfa01a7dbe0 WatchSource:0}: Error finding container c4cbafcaba471c38d8ccfca66baab43caed29b2ee1d8962a5ecfadfa01a7dbe0: Status 404 returned error can't find the container with id c4cbafcaba471c38d8ccfca66baab43caed29b2ee1d8962a5ecfadfa01a7dbe0 Jan 11 18:40:19 crc kubenswrapper[4837]: I0111 18:40:19.517297 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/crc-debug-784t5" event={"ID":"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914","Type":"ContainerStarted","Data":"c4cbafcaba471c38d8ccfca66baab43caed29b2ee1d8962a5ecfadfa01a7dbe0"} Jan 11 18:40:20 crc kubenswrapper[4837]: I0111 18:40:20.374880 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50e3212-84bd-49a1-a90e-4fdd6f68d3d6" path="/var/lib/kubelet/pods/c50e3212-84bd-49a1-a90e-4fdd6f68d3d6/volumes" Jan 11 18:40:20 crc kubenswrapper[4837]: I0111 18:40:20.529612 4837 generic.go:334] "Generic (PLEG): container finished" podID="e2e68a56-5e3c-4d6b-8091-8f0bee0fe914" containerID="13af612321d308dbfca81341c206724c5e9ba9916df850976e52f2a92cf4d8da" exitCode=0 Jan 11 18:40:20 crc kubenswrapper[4837]: I0111 18:40:20.529693 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-bfzgr/crc-debug-784t5" event={"ID":"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914","Type":"ContainerDied","Data":"13af612321d308dbfca81341c206724c5e9ba9916df850976e52f2a92cf4d8da"} Jan 11 18:40:20 crc kubenswrapper[4837]: I0111 18:40:20.931849 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-784t5"] Jan 11 18:40:20 crc kubenswrapper[4837]: I0111 18:40:20.942021 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-784t5"] Jan 11 18:40:21 crc kubenswrapper[4837]: I0111 18:40:21.656055 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:21 crc kubenswrapper[4837]: I0111 18:40:21.840654 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f95hx\" (UniqueName: \"kubernetes.io/projected/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-kube-api-access-f95hx\") pod \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " Jan 11 18:40:21 crc kubenswrapper[4837]: I0111 18:40:21.840888 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-host\") pod \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\" (UID: \"e2e68a56-5e3c-4d6b-8091-8f0bee0fe914\") " Jan 11 18:40:21 crc kubenswrapper[4837]: I0111 18:40:21.841228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-host" (OuterVolumeSpecName: "host") pod "e2e68a56-5e3c-4d6b-8091-8f0bee0fe914" (UID: "e2e68a56-5e3c-4d6b-8091-8f0bee0fe914"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 18:40:21 crc kubenswrapper[4837]: I0111 18:40:21.841461 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-host\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:21 crc kubenswrapper[4837]: I0111 18:40:21.847716 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-kube-api-access-f95hx" (OuterVolumeSpecName: "kube-api-access-f95hx") pod "e2e68a56-5e3c-4d6b-8091-8f0bee0fe914" (UID: "e2e68a56-5e3c-4d6b-8091-8f0bee0fe914"). InnerVolumeSpecName "kube-api-access-f95hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:40:21 crc kubenswrapper[4837]: I0111 18:40:21.943070 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f95hx\" (UniqueName: \"kubernetes.io/projected/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914-kube-api-access-f95hx\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.169238 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-km4zc"] Jan 11 18:40:22 crc kubenswrapper[4837]: E0111 18:40:22.169785 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e68a56-5e3c-4d6b-8091-8f0bee0fe914" containerName="container-00" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.169815 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e68a56-5e3c-4d6b-8091-8f0bee0fe914" containerName="container-00" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.170177 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e68a56-5e3c-4d6b-8091-8f0bee0fe914" containerName="container-00" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.173365 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.350741 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-host\") pod \"crc-debug-km4zc\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.351331 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ztw4\" (UniqueName: \"kubernetes.io/projected/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-kube-api-access-2ztw4\") pod \"crc-debug-km4zc\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.374609 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e68a56-5e3c-4d6b-8091-8f0bee0fe914" path="/var/lib/kubelet/pods/e2e68a56-5e3c-4d6b-8091-8f0bee0fe914/volumes" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.453736 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-host\") pod \"crc-debug-km4zc\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.453840 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-host\") pod \"crc-debug-km4zc\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.453976 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2ztw4\" (UniqueName: \"kubernetes.io/projected/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-kube-api-access-2ztw4\") pod \"crc-debug-km4zc\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.473838 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ztw4\" (UniqueName: \"kubernetes.io/projected/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-kube-api-access-2ztw4\") pod \"crc-debug-km4zc\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.496099 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.546594 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/crc-debug-km4zc" event={"ID":"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54","Type":"ContainerStarted","Data":"197cb606b717caf3549a133096dbc57c009578f92c67851ad73a77764cda60ec"} Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.548362 4837 scope.go:117] "RemoveContainer" containerID="13af612321d308dbfca81341c206724c5e9ba9916df850976e52f2a92cf4d8da" Jan 11 18:40:22 crc kubenswrapper[4837]: I0111 18:40:22.548504 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-784t5" Jan 11 18:40:23 crc kubenswrapper[4837]: I0111 18:40:23.557338 4837 generic.go:334] "Generic (PLEG): container finished" podID="aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54" containerID="d9100aebf2fdee000f0461562cd6c47a2a5e5de39ea8ace41e140e18ec4f659e" exitCode=0 Jan 11 18:40:23 crc kubenswrapper[4837]: I0111 18:40:23.557605 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/crc-debug-km4zc" event={"ID":"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54","Type":"ContainerDied","Data":"d9100aebf2fdee000f0461562cd6c47a2a5e5de39ea8ace41e140e18ec4f659e"} Jan 11 18:40:23 crc kubenswrapper[4837]: I0111 18:40:23.605725 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-km4zc"] Jan 11 18:40:23 crc kubenswrapper[4837]: I0111 18:40:23.616601 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bfzgr/crc-debug-km4zc"] Jan 11 18:40:24 crc kubenswrapper[4837]: I0111 18:40:24.697265 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:24 crc kubenswrapper[4837]: I0111 18:40:24.802771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-host\") pod \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " Jan 11 18:40:24 crc kubenswrapper[4837]: I0111 18:40:24.803028 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ztw4\" (UniqueName: \"kubernetes.io/projected/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-kube-api-access-2ztw4\") pod \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\" (UID: \"aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54\") " Jan 11 18:40:24 crc kubenswrapper[4837]: I0111 18:40:24.803036 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-host" (OuterVolumeSpecName: "host") pod "aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54" (UID: "aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 11 18:40:24 crc kubenswrapper[4837]: I0111 18:40:24.809538 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-kube-api-access-2ztw4" (OuterVolumeSpecName: "kube-api-access-2ztw4") pod "aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54" (UID: "aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54"). InnerVolumeSpecName "kube-api-access-2ztw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:40:24 crc kubenswrapper[4837]: I0111 18:40:24.905197 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-host\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:24 crc kubenswrapper[4837]: I0111 18:40:24.905236 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ztw4\" (UniqueName: \"kubernetes.io/projected/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54-kube-api-access-2ztw4\") on node \"crc\" DevicePath \"\"" Jan 11 18:40:25 crc kubenswrapper[4837]: I0111 18:40:25.589520 4837 scope.go:117] "RemoveContainer" containerID="d9100aebf2fdee000f0461562cd6c47a2a5e5de39ea8ace41e140e18ec4f659e" Jan 11 18:40:25 crc kubenswrapper[4837]: I0111 18:40:25.589575 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/crc-debug-km4zc" Jan 11 18:40:26 crc kubenswrapper[4837]: I0111 18:40:26.392995 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54" path="/var/lib/kubelet/pods/aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54/volumes" Jan 11 18:40:39 crc kubenswrapper[4837]: I0111 18:40:39.444593 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:40:39 crc kubenswrapper[4837]: I0111 18:40:39.445205 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:40:39 crc kubenswrapper[4837]: 
I0111 18:40:39.445263 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:40:39 crc kubenswrapper[4837]: I0111 18:40:39.446210 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c05585a8023b2af68a7f80685d84b74e2cdf192f7d872b26f5d474040ef515c9"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:40:39 crc kubenswrapper[4837]: I0111 18:40:39.446265 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://c05585a8023b2af68a7f80685d84b74e2cdf192f7d872b26f5d474040ef515c9" gracePeriod=600 Jan 11 18:40:39 crc kubenswrapper[4837]: I0111 18:40:39.731742 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerID="c05585a8023b2af68a7f80685d84b74e2cdf192f7d872b26f5d474040ef515c9" exitCode=0 Jan 11 18:40:39 crc kubenswrapper[4837]: I0111 18:40:39.731743 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"c05585a8023b2af68a7f80685d84b74e2cdf192f7d872b26f5d474040ef515c9"} Jan 11 18:40:39 crc kubenswrapper[4837]: I0111 18:40:39.732042 4837 scope.go:117] "RemoveContainer" containerID="0032c5226bd846c444b1e1269293c95e1f965f59227e93f022572cc4a00df926" Jan 11 18:40:40 crc kubenswrapper[4837]: I0111 18:40:40.741224 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" 
event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8"} Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.171666 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c9967c84b-cfjvt_42d725d3-10ec-4492-8598-b505cef336fd/barbican-api/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.350126 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c9967c84b-cfjvt_42d725d3-10ec-4492-8598-b505cef336fd/barbican-api-log/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.416815 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6dbd56445d-4bk5s_bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0/barbican-keystone-listener/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.431693 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6dbd56445d-4bk5s_bdeb4f5a-a034-4698-bf56-2f6e1d9cd7d0/barbican-keystone-listener-log/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.627365 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f749597dc-j8n24_59aacef4-5c25-42e6-a96f-5ca46dc94667/barbican-worker-log/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.631079 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f749597dc-j8n24_59aacef4-5c25-42e6-a96f-5ca46dc94667/barbican-worker/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.867238 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-kfwxw_bb11040a-b0ba-46f9-b2bf-65c1b0c8cf51/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.904479 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/ceilometer-central-agent/0.log" Jan 11 18:40:52 crc kubenswrapper[4837]: I0111 18:40:52.952470 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/ceilometer-notification-agent/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.057876 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/proxy-httpd/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.094561 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_19839462-912e-421f-8d6d-a5ef5d8129f5/sg-core/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.212262 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a59977a8-3e8d-4fa9-866e-541d9e0d4bda/cinder-api/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.326557 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a59977a8-3e8d-4fa9-866e-541d9e0d4bda/cinder-api-log/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.421348 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b872cd-f683-45bb-94db-710d997ef648/cinder-scheduler/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.464001 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f6b872cd-f683-45bb-94db-710d997ef648/probe/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.574205 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xmwj9_fd04d490-42de-47b9-aa6f-bc09ba8dd539/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.682289 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-nffxn_355acc57-d5c4-46fa-8881-61cae424d004/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.771406 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-ch9dq_25f14f43-12d0-4c7d-b823-4c6d3eecd355/init/0.log" Jan 11 18:40:53 crc kubenswrapper[4837]: I0111 18:40:53.939070 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-ch9dq_25f14f43-12d0-4c7d-b823-4c6d3eecd355/init/0.log" Jan 11 18:40:54 crc kubenswrapper[4837]: I0111 18:40:54.015018 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-ch9dq_25f14f43-12d0-4c7d-b823-4c6d3eecd355/dnsmasq-dns/0.log" Jan 11 18:40:54 crc kubenswrapper[4837]: I0111 18:40:54.060851 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-4d8rm_e6773d83-814c-42bc-8578-5746bb984988/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:54 crc kubenswrapper[4837]: I0111 18:40:54.250125 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3914d94-6947-4a7c-ac5e-45bfe15ae144/glance-httpd/0.log" Jan 11 18:40:54 crc kubenswrapper[4837]: I0111 18:40:54.290452 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f3914d94-6947-4a7c-ac5e-45bfe15ae144/glance-log/0.log" Jan 11 18:40:54 crc kubenswrapper[4837]: I0111 18:40:54.448995 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_57a88fde-50af-4286-b9c6-8a5300b7f26b/glance-httpd/0.log" Jan 11 18:40:54 crc kubenswrapper[4837]: I0111 18:40:54.492233 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_57a88fde-50af-4286-b9c6-8a5300b7f26b/glance-log/0.log" Jan 11 18:40:54 crc kubenswrapper[4837]: I0111 18:40:54.825359 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f65cf99f6-zwzzs_ad90513d-7bd8-4407-af16-8d041440673f/horizon/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.007440 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mknj9_e253716a-cb9e-4a48-aca6-5cbd870ef9d5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.168896 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f65cf99f6-zwzzs_ad90513d-7bd8-4407-af16-8d041440673f/horizon-log/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.193586 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-j8p5n_24037829-f96f-4b2e-93b1-968e19a0edb8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.377261 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-79986b9d84-7gl5k_42661ef7-7007-4cff-b945-85690a07399f/keystone-api/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.380716 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29469241-hc276_d2c8127b-3998-456f-bd2c-01f945d7f0b9/keystone-cron/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.568745 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-dwz7s_2948a5a1-4557-4e6b-82d0-6b8e9d7408b1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.570592 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_e2197654-71c4-403f-98d8-994d0225a199/kube-state-metrics/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.973640 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75f9d98d89-rxjpb_556f75eb-e607-44ee-bbde-cc94844a98bd/neutron-httpd/0.log" Jan 11 18:40:55 crc kubenswrapper[4837]: I0111 18:40:55.996019 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75f9d98d89-rxjpb_556f75eb-e607-44ee-bbde-cc94844a98bd/neutron-api/0.log" Jan 11 18:40:56 crc kubenswrapper[4837]: I0111 18:40:56.076117 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-s96gx_dafad3b0-31b4-467e-9604-485cb65e91e5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:56 crc kubenswrapper[4837]: I0111 18:40:56.532768 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c6de8f9a-cf35-49c6-8b0c-d75ac48b3691/nova-api-log/0.log" Jan 11 18:40:56 crc kubenswrapper[4837]: I0111 18:40:56.770311 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bb1d44ca-482f-455e-bb8c-7c409c3ad6f8/nova-cell0-conductor-conductor/0.log" Jan 11 18:40:56 crc kubenswrapper[4837]: I0111 18:40:56.895117 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_88d7f74b-9a47-4152-bec1-11e05030e750/nova-cell1-conductor-conductor/0.log" Jan 11 18:40:56 crc kubenswrapper[4837]: I0111 18:40:56.995094 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c6de8f9a-cf35-49c6-8b0c-d75ac48b3691/nova-api-api/0.log" Jan 11 18:40:57 crc kubenswrapper[4837]: I0111 18:40:57.056062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d590d80f-b67b-4740-8433-bcab03dca733/nova-cell1-novncproxy-novncproxy/0.log" Jan 11 18:40:57 crc 
kubenswrapper[4837]: I0111 18:40:57.180585 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fctb7_af0d2223-27cf-46b8-9105-735784f027d5/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:40:57 crc kubenswrapper[4837]: I0111 18:40:57.393981 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1847dbaa-536d-48f0-ac85-de5ad698e483/nova-metadata-log/0.log" Jan 11 18:40:57 crc kubenswrapper[4837]: I0111 18:40:57.645472 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4180ef05-e41c-4e74-8e23-41fbda984554/nova-scheduler-scheduler/0.log" Jan 11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.232169 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bafaf023-917f-44a9-807e-b6a0f6a55e77/mysql-bootstrap/0.log" Jan 11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.409772 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bafaf023-917f-44a9-807e-b6a0f6a55e77/mysql-bootstrap/0.log" Jan 11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.436476 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bafaf023-917f-44a9-807e-b6a0f6a55e77/galera/0.log" Jan 11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.644609 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1847dbaa-536d-48f0-ac85-de5ad698e483/nova-metadata-metadata/0.log" Jan 11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.650839 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_09134535-27db-4787-89a5-c01f72ffa182/mysql-bootstrap/0.log" Jan 11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.779187 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_09134535-27db-4787-89a5-c01f72ffa182/mysql-bootstrap/0.log" Jan 
11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.859803 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_09134535-27db-4787-89a5-c01f72ffa182/galera/0.log" Jan 11 18:40:58 crc kubenswrapper[4837]: I0111 18:40:58.880555 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a2380f65-2f68-4a02-95c4-b3fd94ba3adc/openstackclient/0.log" Jan 11 18:40:59 crc kubenswrapper[4837]: I0111 18:40:59.032524 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nfclh_62b32964-26a8-4080-a404-0b40c3122184/openstack-network-exporter/0.log" Jan 11 18:40:59 crc kubenswrapper[4837]: I0111 18:40:59.573523 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovsdb-server-init/0.log" Jan 11 18:40:59 crc kubenswrapper[4837]: I0111 18:40:59.798109 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovs-vswitchd/0.log" Jan 11 18:40:59 crc kubenswrapper[4837]: I0111 18:40:59.808637 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovsdb-server-init/0.log" Jan 11 18:40:59 crc kubenswrapper[4837]: I0111 18:40:59.832502 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-22bpd_0e7e2f2f-8ba4-4156-a06f-abe8b8c39477/ovsdb-server/0.log" Jan 11 18:40:59 crc kubenswrapper[4837]: I0111 18:40:59.969062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zfjdc_91f28f51-1965-4fdd-bcb8-c261644249d5/ovn-controller/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.091070 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-gqrp6_8b03a0af-96d2-4573-aef4-3010b10d138b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.211640 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_de66fa79-5d8b-48c3-a30a-af21fbdd19b3/openstack-network-exporter/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.260501 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_de66fa79-5d8b-48c3-a30a-af21fbdd19b3/ovn-northd/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.365583 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5e69a588-3047-499a-b5cb-000fdcc7762a/openstack-network-exporter/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.417665 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5e69a588-3047-499a-b5cb-000fdcc7762a/ovsdbserver-nb/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.560686 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bc3c0fec-5357-46ca-929a-527f01e1eb3d/ovsdbserver-sb/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.579368 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bc3c0fec-5357-46ca-929a-527f01e1eb3d/openstack-network-exporter/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.786111 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-86bdff5ffb-hdnql_1a6ff225-8495-4008-9719-c85bcb7fa65b/placement-api/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.855348 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-86bdff5ffb-hdnql_1a6ff225-8495-4008-9719-c85bcb7fa65b/placement-log/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.865644 4837 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_33de23cd-829c-449c-a816-d8a54f8ea68f/setup-container/0.log" Jan 11 18:41:00 crc kubenswrapper[4837]: I0111 18:41:00.999596 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_33de23cd-829c-449c-a816-d8a54f8ea68f/setup-container/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.128721 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c8ee05fd-6122-4935-a3c0-4b9f71175434/setup-container/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.160110 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_33de23cd-829c-449c-a816-d8a54f8ea68f/rabbitmq/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.279243 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c8ee05fd-6122-4935-a3c0-4b9f71175434/setup-container/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.299039 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c8ee05fd-6122-4935-a3c0-4b9f71175434/rabbitmq/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.397523 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ghw7s_a2775520-8fe3-45e2-aab4-91f962ef86cb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.672886 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-jwxdr_8c4987b1-e485-4665-9094-ed0f9cd0ed7d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.801871 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vp9f5_4c81bfed-8b17-4ea6-90f8-794ea9dec0f6/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:41:01 crc kubenswrapper[4837]: I0111 18:41:01.886850 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xtrks_2809dbe5-de4c-4d4d-9a2c-85c51f4591cb/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.054597 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jcmv2_bd3dd5e3-2424-41fd-a0cf-ae265214d12f/ssh-known-hosts-edpm-deployment/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.242567 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85f864d5b5-z8rsp_134689b3-4006-4e5e-a051-cf51f6c9cf51/proxy-httpd/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.290559 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85f864d5b5-z8rsp_134689b3-4006-4e5e-a051-cf51f6c9cf51/proxy-server/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.386892 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qn5fn_43068ba1-1d19-4822-88fa-e52f8fb21738/swift-ring-rebalance/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.503361 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-auditor/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.566283 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-reaper/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.661171 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-replicator/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.707634 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/account-server/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.713347 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-auditor/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.751456 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-replicator/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.876763 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-server/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.937505 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-expirer/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.958790 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/container-updater/0.log" Jan 11 18:41:02 crc kubenswrapper[4837]: I0111 18:41:02.960723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-auditor/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.107765 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-replicator/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.145369 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-server/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.170643 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/rsync/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.203582 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/object-updater/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.330476 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_23a0b787-b5b4-4a4e-828b-d7f34853603f/swift-recon-cron/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.412002 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-fg96b_38ba1b37-c033-461b-bf07-7aecd5d1e5a1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.522913 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_803594a1-a21b-4a8d-bf22-a2f1786b3822/tempest-tests-tempest-tests-runner/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.644962 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_cab1d2b9-90c7-478b-905e-0487cb825e65/test-operator-logs-container/0.log" Jan 11 18:41:03 crc kubenswrapper[4837]: I0111 18:41:03.790846 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9nvm9_be22134a-b58f-4a66-bcb2-0545a067b33b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 11 18:41:13 crc kubenswrapper[4837]: I0111 18:41:13.677744 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_d1aa5cf3-303a-4a5b-8802-fe264fa090d6/memcached/0.log" Jan 11 18:41:32 crc kubenswrapper[4837]: I0111 18:41:32.463056 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-bv24m_02e82478-6974-4ae1-b8de-57688876d070/manager/0.log" Jan 11 18:41:32 crc kubenswrapper[4837]: I0111 18:41:32.647255 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/util/0.log" Jan 11 18:41:32 crc kubenswrapper[4837]: I0111 18:41:32.821232 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/pull/0.log" Jan 11 18:41:32 crc kubenswrapper[4837]: I0111 18:41:32.828858 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/pull/0.log" Jan 11 18:41:32 crc kubenswrapper[4837]: I0111 18:41:32.886742 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/util/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.004100 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/util/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.020066 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/extract/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.045373 4837 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_cca30dc84e08e010a301b72613dd29e91e8e5b398c039860aa526153c79jwqg_a02b8a77-2a89-46a4-9aba-2472a30559f7/pull/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.188650 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-6fhn7_eb2c9390-f27a-46b0-9249-3e9bdc0c99e3/manager/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.250611 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-26d4f_63384d88-7d49-4951-8ccd-10871b0b18ad/manager/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.418325 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-9ptmm_c4d04eda-5046-43cd-b407-ed14ec61cbd6/manager/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.473953 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-cz9h6_61f99042-0859-46d8-9af9-727352a885ee/manager/0.log" Jan 11 18:41:33 crc kubenswrapper[4837]: I0111 18:41:33.649146 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-75fgx_fc05ccce-2544-4a54-bdf8-ec1b792ac1ba/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.059903 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-4hndt_c2312108-ddf5-4939-acc1-727557936791/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.070482 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-87jjr_537b7dae-5831-4fa5-afba-a5c7e1229e61/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.134900 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-jgrh6_8bdb5237-cb95-4e0c-b52c-85a8a419506b/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.308461 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-665ds_b8bd1c51-4c79-44f9-b7d8-e43dd8118ba4/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.357572 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-7k4z8_23a744d5-da8a-4fda-8c27-652e4f18d736/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.478339 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-x648q_69296cc2-890b-439c-8151-9b10963bae3f/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.673827 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vm5gr_b7039fa0-8e22-4369-abcd-baa005429b7b/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.701862 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-2lvvk_b8022f77-44ba-493f-bed8-ad82fa1ca45a/manager/0.log" Jan 11 18:41:34 crc kubenswrapper[4837]: I0111 18:41:34.916875 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854pfgjq_374c350e-a484-40a8-8563-45eb7f3eafd1/manager/0.log" Jan 11 18:41:35 crc kubenswrapper[4837]: I0111 18:41:35.196066 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qqjnc_815a7cf2-a384-4c14-954a-19e05a030e78/registry-server/0.log" Jan 11 18:41:35 crc kubenswrapper[4837]: I0111 
18:41:35.264733 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-597c79dd4-2dspz_229a8de5-0ba1-4408-b093-28e6e74c143b/operator/0.log" Jan 11 18:41:35 crc kubenswrapper[4837]: I0111 18:41:35.507049 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-4txs5_8fe4bbe3-9aed-4232-9036-d53346db80b2/manager/0.log" Jan 11 18:41:35 crc kubenswrapper[4837]: I0111 18:41:35.924444 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5569b88c46-6jqzq_3081056a-171f-44ab-a8c4-57a3c40686c4/manager/0.log" Jan 11 18:41:36 crc kubenswrapper[4837]: I0111 18:41:36.092667 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-wf222_ddce549f-ba1d-483d-b50b-4011c826bbff/manager/0.log" Jan 11 18:41:36 crc kubenswrapper[4837]: I0111 18:41:36.103404 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kwp7z_c76abbe1-c9d2-414f-8c9a-372f8d5e17bc/operator/0.log" Jan 11 18:41:36 crc kubenswrapper[4837]: I0111 18:41:36.291374 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-hspqd_5ea53463-b9a9-4406-b27f-ab1324f4bdcc/manager/0.log" Jan 11 18:41:36 crc kubenswrapper[4837]: I0111 18:41:36.334338 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-tkj2c_0296da23-fe5c-4f47-b26b-6d83da73bf31/manager/0.log" Jan 11 18:41:36 crc kubenswrapper[4837]: I0111 18:41:36.378089 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-shqx6_04067843-8e2d-4a0c-8c68-2e321669b605/manager/0.log" Jan 11 18:41:36 crc 
kubenswrapper[4837]: I0111 18:41:36.487387 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jsqjz_56dd103a-afaf-46fa-9cf3-f85418264d29/manager/0.log" Jan 11 18:41:58 crc kubenswrapper[4837]: I0111 18:41:58.791647 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bs6rl_75e7928f-bf2f-4a17-b2eb-f4fc925c7ce3/control-plane-machine-set-operator/0.log" Jan 11 18:41:58 crc kubenswrapper[4837]: I0111 18:41:58.955706 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xk9j9_b61e27df-5c38-48b3-b6e9-bca3ce8aa429/kube-rbac-proxy/0.log" Jan 11 18:41:59 crc kubenswrapper[4837]: I0111 18:41:59.009152 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xk9j9_b61e27df-5c38-48b3-b6e9-bca3ce8aa429/machine-api-operator/0.log" Jan 11 18:42:12 crc kubenswrapper[4837]: I0111 18:42:12.660925 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-2f444_469b992e-fb84-479b-8ec6-5c6490e9daf5/cert-manager-controller/0.log" Jan 11 18:42:12 crc kubenswrapper[4837]: I0111 18:42:12.809058 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-j7rbn_77732f18-1dd2-475e-9d27-69cf1f66df7d/cert-manager-cainjector/0.log" Jan 11 18:42:12 crc kubenswrapper[4837]: I0111 18:42:12.860170 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ssj7v_5bf7e751-4059-4025-b610-732ec84bda0d/cert-manager-webhook/0.log" Jan 11 18:42:27 crc kubenswrapper[4837]: I0111 18:42:27.058412 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-kkz9f_3b85571d-dea1-437f-bd5c-27d5d421411e/nmstate-console-plugin/0.log" Jan 11 
18:42:27 crc kubenswrapper[4837]: I0111 18:42:27.235332 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-68ws8_1c2557e3-14a8-4911-92b0-564bb7b60b06/nmstate-handler/0.log" Jan 11 18:42:27 crc kubenswrapper[4837]: I0111 18:42:27.324378 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-znz8s_c440fad0-c0e0-4553-ad26-b843f81c8863/kube-rbac-proxy/0.log" Jan 11 18:42:27 crc kubenswrapper[4837]: I0111 18:42:27.382982 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-znz8s_c440fad0-c0e0-4553-ad26-b843f81c8863/nmstate-metrics/0.log" Jan 11 18:42:27 crc kubenswrapper[4837]: I0111 18:42:27.457132 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-tx7vk_7340b2fb-4088-4358-977e-020434c7fa2c/nmstate-operator/0.log" Jan 11 18:42:27 crc kubenswrapper[4837]: I0111 18:42:27.562407 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-4tph5_5d9870f8-4c71-4490-8e77-17f1a82e725a/nmstate-webhook/0.log" Jan 11 18:42:39 crc kubenswrapper[4837]: I0111 18:42:39.444535 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:42:39 crc kubenswrapper[4837]: I0111 18:42:39.445239 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.234296 4837 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5bpfr_da885226-0b14-4626-8d89-7d4505ab29a1/kube-rbac-proxy/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.414590 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5bpfr_da885226-0b14-4626-8d89-7d4505ab29a1/controller/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.439234 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.594540 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.604420 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.625315 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.678292 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.874951 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.877765 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.885178 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 18:42:56 crc kubenswrapper[4837]: I0111 18:42:56.908379 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.043534 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-frr-files/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.064065 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-reloader/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.098876 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/controller/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.104623 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/cp-metrics/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.266345 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/frr-metrics/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.297843 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/kube-rbac-proxy/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.326812 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/kube-rbac-proxy-frr/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.491755 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/reloader/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.528820 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-568c8_aa1bc5b4-0f84-413a-a7fd-d2531bbb8265/frr-k8s-webhook-server/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.780008 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b497646f-46nhj_b3e8b743-e8f1-453a-9f63-44700da2d56a/manager/0.log" Jan 11 18:42:57 crc kubenswrapper[4837]: I0111 18:42:57.906207 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-55549fb586-lvpmn_28a54c4f-092d-4c3e-b528-9d3651c4f3a9/webhook-server/0.log" Jan 11 18:42:58 crc kubenswrapper[4837]: I0111 18:42:58.077950 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8bfbc_e3e46c8e-1e90-49a8-a3eb-879ccd3c4807/kube-rbac-proxy/0.log" Jan 11 18:42:58 crc kubenswrapper[4837]: I0111 18:42:58.506775 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m6x95_d4fb0b82-36cb-45ce-b356-1a740d312fcf/frr/0.log" Jan 11 18:42:58 crc kubenswrapper[4837]: I0111 18:42:58.540252 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8bfbc_e3e46c8e-1e90-49a8-a3eb-879ccd3c4807/speaker/0.log" Jan 11 18:43:07 crc kubenswrapper[4837]: I0111 18:43:07.484891 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-m6x95" podUID="d4fb0b82-36cb-45ce-b356-1a740d312fcf" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 11 18:43:09 crc kubenswrapper[4837]: I0111 18:43:09.444589 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:43:09 crc kubenswrapper[4837]: I0111 18:43:09.444978 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:43:15 crc kubenswrapper[4837]: I0111 18:43:15.309292 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/util/0.log" Jan 11 18:43:15 crc kubenswrapper[4837]: I0111 18:43:15.567149 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/util/0.log" Jan 11 18:43:15 crc kubenswrapper[4837]: I0111 18:43:15.571315 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/pull/0.log" Jan 11 18:43:15 crc kubenswrapper[4837]: I0111 18:43:15.574078 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/pull/0.log" Jan 11 18:43:15 crc kubenswrapper[4837]: I0111 18:43:15.719753 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/util/0.log" Jan 11 18:43:15 crc kubenswrapper[4837]: I0111 18:43:15.724990 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/pull/0.log" Jan 11 18:43:15 crc kubenswrapper[4837]: I0111 18:43:15.752889 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4f2nrw_55364c0f-8ad4-40fb-8739-65b09f608b27/extract/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.325462 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/util/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.517988 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/util/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.564412 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/pull/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.590007 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/pull/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.699437 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/util/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.758818 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/extract/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.765284 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8ds5q5_a50fb0c0-dff3-4722-b0ed-4c014c80faee/pull/0.log" Jan 11 18:43:16 crc kubenswrapper[4837]: I0111 18:43:16.885045 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-utilities/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.017098 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-utilities/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.045685 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-content/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.069054 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-content/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.251306 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-content/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.305463 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/extract-utilities/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.509570 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-utilities/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.659230 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-utilities/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.681637 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-content/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.767633 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bh2tn_2efe9489-3447-4e79-b762-935c88a0c3fe/registry-server/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.779894 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-content/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.913191 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-utilities/0.log" Jan 11 18:43:17 crc kubenswrapper[4837]: I0111 18:43:17.922513 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/extract-content/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.152437 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xb262_a46aac0a-4b71-4559-9481-499e240587e4/marketplace-operator/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.287084 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-utilities/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.446663 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsj24_734801c7-6fa0-4055-a0bd-22b2824d4312/registry-server/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.469916 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-utilities/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.472296 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-content/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.539409 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-content/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.615360 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-utilities/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.645870 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/extract-content/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.746142 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-utilities/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.801455 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-cg2qj_179b571c-cc46-454a-bf70-652f09e1c934/registry-server/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.937022 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-content/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.950580 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-content/0.log" Jan 11 18:43:18 crc kubenswrapper[4837]: I0111 18:43:18.960229 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-utilities/0.log" Jan 11 18:43:19 crc kubenswrapper[4837]: I0111 18:43:19.122772 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-utilities/0.log" Jan 11 18:43:19 crc kubenswrapper[4837]: I0111 18:43:19.138305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/extract-content/0.log" Jan 11 18:43:19 crc kubenswrapper[4837]: I0111 18:43:19.559975 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gs7kc_2467754a-ee89-4272-886e-bd185cc623a3/registry-server/0.log" Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.444240 4837 patch_prober.go:28] interesting pod/machine-config-daemon-pqnst container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.444844 4837 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.444891 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.445646 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8"} pod="openshift-machine-config-operator/machine-config-daemon-pqnst" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.445724 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" containerName="machine-config-daemon" containerID="cri-o://fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" gracePeriod=600 Jan 11 18:43:39 crc kubenswrapper[4837]: E0111 18:43:39.577005 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.997008 4837 generic.go:334] "Generic (PLEG): container finished" podID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" 
containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" exitCode=0 Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.997204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerDied","Data":"fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8"} Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.997268 4837 scope.go:117] "RemoveContainer" containerID="c05585a8023b2af68a7f80685d84b74e2cdf192f7d872b26f5d474040ef515c9" Jan 11 18:43:39 crc kubenswrapper[4837]: I0111 18:43:39.997925 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:43:39 crc kubenswrapper[4837]: E0111 18:43:39.998165 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:43:52 crc kubenswrapper[4837]: I0111 18:43:52.365008 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:43:52 crc kubenswrapper[4837]: E0111 18:43:52.365643 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:44:06 crc kubenswrapper[4837]: I0111 
18:44:06.371407 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:44:06 crc kubenswrapper[4837]: E0111 18:44:06.372922 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:44:17 crc kubenswrapper[4837]: I0111 18:44:17.365015 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:44:17 crc kubenswrapper[4837]: E0111 18:44:17.366047 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:44:29 crc kubenswrapper[4837]: I0111 18:44:29.364636 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:44:29 crc kubenswrapper[4837]: E0111 18:44:29.365567 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:44:41 crc 
kubenswrapper[4837]: I0111 18:44:41.366113 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:44:41 crc kubenswrapper[4837]: E0111 18:44:41.367576 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:44:56 crc kubenswrapper[4837]: I0111 18:44:56.373945 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:44:56 crc kubenswrapper[4837]: E0111 18:44:56.377328 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.216873 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw"] Jan 11 18:45:00 crc kubenswrapper[4837]: E0111 18:45:00.219291 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54" containerName="container-00" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.219391 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54" containerName="container-00" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.219734 4837 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="aeda6f4c-e596-4aa3-9e34-e3a8ccc48d54" containerName="container-00" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.220513 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.224058 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.224269 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.227148 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw"] Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.360924 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93594db4-f700-4c4c-a508-d21774e749d6-config-volume\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.360987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93594db4-f700-4c4c-a508-d21774e749d6-secret-volume\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.361048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wsl\" (UniqueName: 
\"kubernetes.io/projected/93594db4-f700-4c4c-a508-d21774e749d6-kube-api-access-44wsl\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.462637 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93594db4-f700-4c4c-a508-d21774e749d6-secret-volume\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.462749 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wsl\" (UniqueName: \"kubernetes.io/projected/93594db4-f700-4c4c-a508-d21774e749d6-kube-api-access-44wsl\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.462859 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93594db4-f700-4c4c-a508-d21774e749d6-config-volume\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.463750 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93594db4-f700-4c4c-a508-d21774e749d6-config-volume\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 
18:45:00.485970 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93594db4-f700-4c4c-a508-d21774e749d6-secret-volume\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.492022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wsl\" (UniqueName: \"kubernetes.io/projected/93594db4-f700-4c4c-a508-d21774e749d6-kube-api-access-44wsl\") pod \"collect-profiles-29469285-8rpjw\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:00 crc kubenswrapper[4837]: I0111 18:45:00.558616 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:01 crc kubenswrapper[4837]: I0111 18:45:01.074794 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw"] Jan 11 18:45:01 crc kubenswrapper[4837]: I0111 18:45:01.848324 4837 generic.go:334] "Generic (PLEG): container finished" podID="93594db4-f700-4c4c-a508-d21774e749d6" containerID="feadc8d52474f58bec33fd213acabbbc7aaef26f062d9c25198e1ce57c9668e4" exitCode=0 Jan 11 18:45:01 crc kubenswrapper[4837]: I0111 18:45:01.848405 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" event={"ID":"93594db4-f700-4c4c-a508-d21774e749d6","Type":"ContainerDied","Data":"feadc8d52474f58bec33fd213acabbbc7aaef26f062d9c25198e1ce57c9668e4"} Jan 11 18:45:01 crc kubenswrapper[4837]: I0111 18:45:01.848453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" 
event={"ID":"93594db4-f700-4c4c-a508-d21774e749d6","Type":"ContainerStarted","Data":"017aca0c2507026291a5bf62efc5a44f89408a90f18b47d8c4ac3ac0ae9b5442"} Jan 11 18:45:02 crc kubenswrapper[4837]: I0111 18:45:02.864070 4837 generic.go:334] "Generic (PLEG): container finished" podID="2001d3ff-95d2-472f-9116-306e63afac42" containerID="0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802" exitCode=0 Jan 11 18:45:02 crc kubenswrapper[4837]: I0111 18:45:02.864219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bfzgr/must-gather-8s65m" event={"ID":"2001d3ff-95d2-472f-9116-306e63afac42","Type":"ContainerDied","Data":"0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802"} Jan 11 18:45:02 crc kubenswrapper[4837]: I0111 18:45:02.865720 4837 scope.go:117] "RemoveContainer" containerID="0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.173764 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bfzgr_must-gather-8s65m_2001d3ff-95d2-472f-9116-306e63afac42/gather/0.log" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.258721 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.435302 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44wsl\" (UniqueName: \"kubernetes.io/projected/93594db4-f700-4c4c-a508-d21774e749d6-kube-api-access-44wsl\") pod \"93594db4-f700-4c4c-a508-d21774e749d6\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.435434 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93594db4-f700-4c4c-a508-d21774e749d6-config-volume\") pod \"93594db4-f700-4c4c-a508-d21774e749d6\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.435491 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93594db4-f700-4c4c-a508-d21774e749d6-secret-volume\") pod \"93594db4-f700-4c4c-a508-d21774e749d6\" (UID: \"93594db4-f700-4c4c-a508-d21774e749d6\") " Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.438095 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93594db4-f700-4c4c-a508-d21774e749d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "93594db4-f700-4c4c-a508-d21774e749d6" (UID: "93594db4-f700-4c4c-a508-d21774e749d6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.456080 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93594db4-f700-4c4c-a508-d21774e749d6-kube-api-access-44wsl" (OuterVolumeSpecName: "kube-api-access-44wsl") pod "93594db4-f700-4c4c-a508-d21774e749d6" (UID: "93594db4-f700-4c4c-a508-d21774e749d6"). 
InnerVolumeSpecName "kube-api-access-44wsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.457623 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93594db4-f700-4c4c-a508-d21774e749d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "93594db4-f700-4c4c-a508-d21774e749d6" (UID: "93594db4-f700-4c4c-a508-d21774e749d6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.539296 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44wsl\" (UniqueName: \"kubernetes.io/projected/93594db4-f700-4c4c-a508-d21774e749d6-kube-api-access-44wsl\") on node \"crc\" DevicePath \"\"" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.539322 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93594db4-f700-4c4c-a508-d21774e749d6-config-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.539333 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93594db4-f700-4c4c-a508-d21774e749d6-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.878858 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" event={"ID":"93594db4-f700-4c4c-a508-d21774e749d6","Type":"ContainerDied","Data":"017aca0c2507026291a5bf62efc5a44f89408a90f18b47d8c4ac3ac0ae9b5442"} Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.878903 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="017aca0c2507026291a5bf62efc5a44f89408a90f18b47d8c4ac3ac0ae9b5442" Jan 11 18:45:03 crc kubenswrapper[4837]: I0111 18:45:03.878940 4837 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29469285-8rpjw" Jan 11 18:45:04 crc kubenswrapper[4837]: I0111 18:45:04.339342 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5"] Jan 11 18:45:04 crc kubenswrapper[4837]: I0111 18:45:04.350214 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29469240-kx6l5"] Jan 11 18:45:04 crc kubenswrapper[4837]: I0111 18:45:04.385830 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118fb501-a455-4398-9221-bc3c8922d5ff" path="/var/lib/kubelet/pods/118fb501-a455-4398-9221-bc3c8922d5ff/volumes" Jan 11 18:45:09 crc kubenswrapper[4837]: I0111 18:45:09.365201 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:45:09 crc kubenswrapper[4837]: E0111 18:45:09.366193 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.300121 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bfzgr/must-gather-8s65m"] Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.300976 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bfzgr/must-gather-8s65m" podUID="2001d3ff-95d2-472f-9116-306e63afac42" containerName="copy" containerID="cri-o://04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b" gracePeriod=2 Jan 11 18:45:14 crc 
kubenswrapper[4837]: I0111 18:45:14.310917 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bfzgr/must-gather-8s65m"] Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.705409 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bfzgr_must-gather-8s65m_2001d3ff-95d2-472f-9116-306e63afac42/copy/0.log" Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.706135 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.857868 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjxmw\" (UniqueName: \"kubernetes.io/projected/2001d3ff-95d2-472f-9116-306e63afac42-kube-api-access-wjxmw\") pod \"2001d3ff-95d2-472f-9116-306e63afac42\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.857936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2001d3ff-95d2-472f-9116-306e63afac42-must-gather-output\") pod \"2001d3ff-95d2-472f-9116-306e63afac42\" (UID: \"2001d3ff-95d2-472f-9116-306e63afac42\") " Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.862952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2001d3ff-95d2-472f-9116-306e63afac42-kube-api-access-wjxmw" (OuterVolumeSpecName: "kube-api-access-wjxmw") pod "2001d3ff-95d2-472f-9116-306e63afac42" (UID: "2001d3ff-95d2-472f-9116-306e63afac42"). InnerVolumeSpecName "kube-api-access-wjxmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.960710 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjxmw\" (UniqueName: \"kubernetes.io/projected/2001d3ff-95d2-472f-9116-306e63afac42-kube-api-access-wjxmw\") on node \"crc\" DevicePath \"\"" Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.982433 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bfzgr_must-gather-8s65m_2001d3ff-95d2-472f-9116-306e63afac42/copy/0.log" Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.982835 4837 generic.go:334] "Generic (PLEG): container finished" podID="2001d3ff-95d2-472f-9116-306e63afac42" containerID="04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b" exitCode=143 Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.982901 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bfzgr/must-gather-8s65m" Jan 11 18:45:14 crc kubenswrapper[4837]: I0111 18:45:14.982916 4837 scope.go:117] "RemoveContainer" containerID="04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b" Jan 11 18:45:15 crc kubenswrapper[4837]: I0111 18:45:15.000442 4837 scope.go:117] "RemoveContainer" containerID="0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802" Jan 11 18:45:15 crc kubenswrapper[4837]: I0111 18:45:15.018083 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2001d3ff-95d2-472f-9116-306e63afac42-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2001d3ff-95d2-472f-9116-306e63afac42" (UID: "2001d3ff-95d2-472f-9116-306e63afac42"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:45:15 crc kubenswrapper[4837]: I0111 18:45:15.062396 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2001d3ff-95d2-472f-9116-306e63afac42-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 11 18:45:15 crc kubenswrapper[4837]: I0111 18:45:15.104253 4837 scope.go:117] "RemoveContainer" containerID="04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b" Jan 11 18:45:15 crc kubenswrapper[4837]: E0111 18:45:15.106462 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b\": container with ID starting with 04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b not found: ID does not exist" containerID="04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b" Jan 11 18:45:15 crc kubenswrapper[4837]: I0111 18:45:15.106500 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b"} err="failed to get container status \"04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b\": rpc error: code = NotFound desc = could not find container \"04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b\": container with ID starting with 04052dc1b4eaaf82b55820b0ef50fa33590dafcf55dc3437b38fa74670f8ae0b not found: ID does not exist" Jan 11 18:45:15 crc kubenswrapper[4837]: I0111 18:45:15.106520 4837 scope.go:117] "RemoveContainer" containerID="0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802" Jan 11 18:45:15 crc kubenswrapper[4837]: E0111 18:45:15.110382 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802\": container with ID starting with 0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802 not found: ID does not exist" containerID="0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802" Jan 11 18:45:15 crc kubenswrapper[4837]: I0111 18:45:15.110413 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802"} err="failed to get container status \"0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802\": rpc error: code = NotFound desc = could not find container \"0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802\": container with ID starting with 0ad8446931f3bed9b2f47c6a8d85b8c8f6deba00740e6665d9b7cdf150d3b802 not found: ID does not exist" Jan 11 18:45:16 crc kubenswrapper[4837]: I0111 18:45:16.376866 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2001d3ff-95d2-472f-9116-306e63afac42" path="/var/lib/kubelet/pods/2001d3ff-95d2-472f-9116-306e63afac42/volumes" Jan 11 18:45:22 crc kubenswrapper[4837]: I0111 18:45:22.364505 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:45:22 crc kubenswrapper[4837]: E0111 18:45:22.365781 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:45:35 crc kubenswrapper[4837]: I0111 18:45:35.364609 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:45:35 crc 
kubenswrapper[4837]: E0111 18:45:35.366129 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:45:46 crc kubenswrapper[4837]: I0111 18:45:46.374827 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:45:46 crc kubenswrapper[4837]: E0111 18:45:46.375650 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:45:54 crc kubenswrapper[4837]: I0111 18:45:54.839527 4837 scope.go:117] "RemoveContainer" containerID="2a893a0965754f760e681e36c14e768a4c22cec4b37f6eef68d16056931d793d" Jan 11 18:45:54 crc kubenswrapper[4837]: I0111 18:45:54.865908 4837 scope.go:117] "RemoveContainer" containerID="8a116824bb73a003079673012a5705754413a97724f263dc40ad9152058d2fa5" Jan 11 18:45:58 crc kubenswrapper[4837]: I0111 18:45:58.365603 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:45:58 crc kubenswrapper[4837]: E0111 18:45:58.366375 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:46:13 crc kubenswrapper[4837]: I0111 18:46:13.365134 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:46:13 crc kubenswrapper[4837]: E0111 18:46:13.366324 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:46:26 crc kubenswrapper[4837]: I0111 18:46:26.370244 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:46:26 crc kubenswrapper[4837]: E0111 18:46:26.371298 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:46:41 crc kubenswrapper[4837]: I0111 18:46:41.364275 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:46:41 crc kubenswrapper[4837]: E0111 18:46:41.365483 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:46:54 crc kubenswrapper[4837]: I0111 18:46:54.364526 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:46:54 crc kubenswrapper[4837]: E0111 18:46:54.365197 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:47:05 crc kubenswrapper[4837]: I0111 18:47:05.364439 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:47:05 crc kubenswrapper[4837]: E0111 18:47:05.366762 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.349740 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4kjbz"] Jan 11 18:47:07 crc kubenswrapper[4837]: E0111 18:47:07.350357 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93594db4-f700-4c4c-a508-d21774e749d6" containerName="collect-profiles" Jan 11 18:47:07 crc 
kubenswrapper[4837]: I0111 18:47:07.350369 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="93594db4-f700-4c4c-a508-d21774e749d6" containerName="collect-profiles" Jan 11 18:47:07 crc kubenswrapper[4837]: E0111 18:47:07.350383 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2001d3ff-95d2-472f-9116-306e63afac42" containerName="copy" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.350389 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2001d3ff-95d2-472f-9116-306e63afac42" containerName="copy" Jan 11 18:47:07 crc kubenswrapper[4837]: E0111 18:47:07.350425 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2001d3ff-95d2-472f-9116-306e63afac42" containerName="gather" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.350431 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2001d3ff-95d2-472f-9116-306e63afac42" containerName="gather" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.350596 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2001d3ff-95d2-472f-9116-306e63afac42" containerName="gather" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.350611 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2001d3ff-95d2-472f-9116-306e63afac42" containerName="copy" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.350624 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="93594db4-f700-4c4c-a508-d21774e749d6" containerName="collect-profiles" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.353203 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.361344 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kjbz"] Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.497463 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgw2f\" (UniqueName: \"kubernetes.io/projected/5f58af82-12cb-466f-aeda-563be432b210-kube-api-access-lgw2f\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.497644 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-catalog-content\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.498184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-utilities\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.602465 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgw2f\" (UniqueName: \"kubernetes.io/projected/5f58af82-12cb-466f-aeda-563be432b210-kube-api-access-lgw2f\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.602617 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-catalog-content\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.602840 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-utilities\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.603171 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-catalog-content\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.603318 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-utilities\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.627820 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgw2f\" (UniqueName: \"kubernetes.io/projected/5f58af82-12cb-466f-aeda-563be432b210-kube-api-access-lgw2f\") pod \"redhat-operators-4kjbz\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:07 crc kubenswrapper[4837]: I0111 18:47:07.690606 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:08 crc kubenswrapper[4837]: I0111 18:47:08.175321 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kjbz"] Jan 11 18:47:09 crc kubenswrapper[4837]: I0111 18:47:09.176116 4837 generic.go:334] "Generic (PLEG): container finished" podID="5f58af82-12cb-466f-aeda-563be432b210" containerID="771b0601aadbaa3a66e80205244749e4677ed8aa58f6c4726fce628c5b5d0719" exitCode=0 Jan 11 18:47:09 crc kubenswrapper[4837]: I0111 18:47:09.176176 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kjbz" event={"ID":"5f58af82-12cb-466f-aeda-563be432b210","Type":"ContainerDied","Data":"771b0601aadbaa3a66e80205244749e4677ed8aa58f6c4726fce628c5b5d0719"} Jan 11 18:47:09 crc kubenswrapper[4837]: I0111 18:47:09.176434 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kjbz" event={"ID":"5f58af82-12cb-466f-aeda-563be432b210","Type":"ContainerStarted","Data":"0ea74ba985086cc574fbeb59ffcaba1a44cd66aa72ef20b78c723aff5089132c"} Jan 11 18:47:09 crc kubenswrapper[4837]: I0111 18:47:09.180352 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 11 18:47:11 crc kubenswrapper[4837]: I0111 18:47:11.199564 4837 generic.go:334] "Generic (PLEG): container finished" podID="5f58af82-12cb-466f-aeda-563be432b210" containerID="ea8caac0b1704c06fe0041a8487a81fc0a03ecab7b2acf9208e72196ad39bc54" exitCode=0 Jan 11 18:47:11 crc kubenswrapper[4837]: I0111 18:47:11.199644 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kjbz" event={"ID":"5f58af82-12cb-466f-aeda-563be432b210","Type":"ContainerDied","Data":"ea8caac0b1704c06fe0041a8487a81fc0a03ecab7b2acf9208e72196ad39bc54"} Jan 11 18:47:12 crc kubenswrapper[4837]: I0111 18:47:12.215047 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4kjbz" event={"ID":"5f58af82-12cb-466f-aeda-563be432b210","Type":"ContainerStarted","Data":"54402c14eb95baaa1b67cbb87090cfccffc14827107564037d3728236412ad7f"} Jan 11 18:47:12 crc kubenswrapper[4837]: I0111 18:47:12.257458 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4kjbz" podStartSLOduration=2.741111499 podStartE2EDuration="5.257429381s" podCreationTimestamp="2026-01-11 18:47:07 +0000 UTC" firstStartedPulling="2026-01-11 18:47:09.180028315 +0000 UTC m=+4603.358221041" lastFinishedPulling="2026-01-11 18:47:11.696346187 +0000 UTC m=+4605.874538923" observedRunningTime="2026-01-11 18:47:12.247297139 +0000 UTC m=+4606.425489845" watchObservedRunningTime="2026-01-11 18:47:12.257429381 +0000 UTC m=+4606.435622127" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.716664 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-htksv"] Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.726155 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.767861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-htksv"] Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.819589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-utilities\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.819658 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j89fw\" (UniqueName: \"kubernetes.io/projected/812de67e-8984-408a-a295-38c85ac294c0-kube-api-access-j89fw\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.819724 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-catalog-content\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.922514 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-utilities\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.922576 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j89fw\" (UniqueName: \"kubernetes.io/projected/812de67e-8984-408a-a295-38c85ac294c0-kube-api-access-j89fw\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.922612 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-catalog-content\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.923148 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-catalog-content\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.923366 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-utilities\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:13 crc kubenswrapper[4837]: I0111 18:47:13.957664 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j89fw\" (UniqueName: \"kubernetes.io/projected/812de67e-8984-408a-a295-38c85ac294c0-kube-api-access-j89fw\") pod \"certified-operators-htksv\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:14 crc kubenswrapper[4837]: I0111 18:47:14.100133 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:14 crc kubenswrapper[4837]: I0111 18:47:14.641892 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-htksv"] Jan 11 18:47:15 crc kubenswrapper[4837]: I0111 18:47:15.246202 4837 generic.go:334] "Generic (PLEG): container finished" podID="812de67e-8984-408a-a295-38c85ac294c0" containerID="d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7" exitCode=0 Jan 11 18:47:15 crc kubenswrapper[4837]: I0111 18:47:15.246295 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htksv" event={"ID":"812de67e-8984-408a-a295-38c85ac294c0","Type":"ContainerDied","Data":"d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7"} Jan 11 18:47:15 crc kubenswrapper[4837]: I0111 18:47:15.246556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htksv" event={"ID":"812de67e-8984-408a-a295-38c85ac294c0","Type":"ContainerStarted","Data":"d0b3bfdd3ac78c9c90bf8d00086777d982973ff7e7e80135e2438a079e967319"} Jan 11 18:47:16 crc kubenswrapper[4837]: I0111 18:47:16.374630 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:47:16 crc kubenswrapper[4837]: E0111 18:47:16.374990 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:47:17 crc kubenswrapper[4837]: I0111 18:47:17.272184 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htksv" 
event={"ID":"812de67e-8984-408a-a295-38c85ac294c0","Type":"ContainerStarted","Data":"3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969"} Jan 11 18:47:17 crc kubenswrapper[4837]: I0111 18:47:17.691765 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:17 crc kubenswrapper[4837]: I0111 18:47:17.691865 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:18 crc kubenswrapper[4837]: I0111 18:47:18.287092 4837 generic.go:334] "Generic (PLEG): container finished" podID="812de67e-8984-408a-a295-38c85ac294c0" containerID="3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969" exitCode=0 Jan 11 18:47:18 crc kubenswrapper[4837]: I0111 18:47:18.287292 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htksv" event={"ID":"812de67e-8984-408a-a295-38c85ac294c0","Type":"ContainerDied","Data":"3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969"} Jan 11 18:47:18 crc kubenswrapper[4837]: I0111 18:47:18.769312 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4kjbz" podUID="5f58af82-12cb-466f-aeda-563be432b210" containerName="registry-server" probeResult="failure" output=< Jan 11 18:47:18 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Jan 11 18:47:18 crc kubenswrapper[4837]: > Jan 11 18:47:20 crc kubenswrapper[4837]: I0111 18:47:20.313262 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htksv" event={"ID":"812de67e-8984-408a-a295-38c85ac294c0","Type":"ContainerStarted","Data":"868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54"} Jan 11 18:47:20 crc kubenswrapper[4837]: I0111 18:47:20.354132 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-htksv" podStartSLOduration=3.91318204 podStartE2EDuration="7.354106492s" podCreationTimestamp="2026-01-11 18:47:13 +0000 UTC" firstStartedPulling="2026-01-11 18:47:15.248848497 +0000 UTC m=+4609.427041223" lastFinishedPulling="2026-01-11 18:47:18.689772929 +0000 UTC m=+4612.867965675" observedRunningTime="2026-01-11 18:47:20.339081898 +0000 UTC m=+4614.517274604" watchObservedRunningTime="2026-01-11 18:47:20.354106492 +0000 UTC m=+4614.532299228" Jan 11 18:47:24 crc kubenswrapper[4837]: I0111 18:47:24.101237 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:24 crc kubenswrapper[4837]: I0111 18:47:24.101553 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:24 crc kubenswrapper[4837]: I0111 18:47:24.151393 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:24 crc kubenswrapper[4837]: I0111 18:47:24.414910 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:24 crc kubenswrapper[4837]: I0111 18:47:24.467145 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-htksv"] Jan 11 18:47:26 crc kubenswrapper[4837]: I0111 18:47:26.383422 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-htksv" podUID="812de67e-8984-408a-a295-38c85ac294c0" containerName="registry-server" containerID="cri-o://868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54" gracePeriod=2 Jan 11 18:47:26 crc kubenswrapper[4837]: I0111 18:47:26.891955 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:26 crc kubenswrapper[4837]: I0111 18:47:26.997105 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-catalog-content\") pod \"812de67e-8984-408a-a295-38c85ac294c0\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " Jan 11 18:47:26 crc kubenswrapper[4837]: I0111 18:47:26.997269 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j89fw\" (UniqueName: \"kubernetes.io/projected/812de67e-8984-408a-a295-38c85ac294c0-kube-api-access-j89fw\") pod \"812de67e-8984-408a-a295-38c85ac294c0\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " Jan 11 18:47:26 crc kubenswrapper[4837]: I0111 18:47:26.997460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-utilities\") pod \"812de67e-8984-408a-a295-38c85ac294c0\" (UID: \"812de67e-8984-408a-a295-38c85ac294c0\") " Jan 11 18:47:26 crc kubenswrapper[4837]: I0111 18:47:26.998390 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-utilities" (OuterVolumeSpecName: "utilities") pod "812de67e-8984-408a-a295-38c85ac294c0" (UID: "812de67e-8984-408a-a295-38c85ac294c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.004971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812de67e-8984-408a-a295-38c85ac294c0-kube-api-access-j89fw" (OuterVolumeSpecName: "kube-api-access-j89fw") pod "812de67e-8984-408a-a295-38c85ac294c0" (UID: "812de67e-8984-408a-a295-38c85ac294c0"). InnerVolumeSpecName "kube-api-access-j89fw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.066183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "812de67e-8984-408a-a295-38c85ac294c0" (UID: "812de67e-8984-408a-a295-38c85ac294c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.099613 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.099653 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812de67e-8984-408a-a295-38c85ac294c0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.099669 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j89fw\" (UniqueName: \"kubernetes.io/projected/812de67e-8984-408a-a295-38c85ac294c0-kube-api-access-j89fw\") on node \"crc\" DevicePath \"\"" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.365504 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:47:27 crc kubenswrapper[4837]: E0111 18:47:27.366564 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:47:27 
crc kubenswrapper[4837]: I0111 18:47:27.395426 4837 generic.go:334] "Generic (PLEG): container finished" podID="812de67e-8984-408a-a295-38c85ac294c0" containerID="868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54" exitCode=0 Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.395487 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htksv" event={"ID":"812de67e-8984-408a-a295-38c85ac294c0","Type":"ContainerDied","Data":"868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54"} Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.395503 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-htksv" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.395525 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htksv" event={"ID":"812de67e-8984-408a-a295-38c85ac294c0","Type":"ContainerDied","Data":"d0b3bfdd3ac78c9c90bf8d00086777d982973ff7e7e80135e2438a079e967319"} Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.395554 4837 scope.go:117] "RemoveContainer" containerID="868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.437319 4837 scope.go:117] "RemoveContainer" containerID="3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.437946 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-htksv"] Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.447466 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-htksv"] Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.475643 4837 scope.go:117] "RemoveContainer" containerID="d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7" Jan 11 18:47:27 crc 
kubenswrapper[4837]: I0111 18:47:27.533889 4837 scope.go:117] "RemoveContainer" containerID="868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54" Jan 11 18:47:27 crc kubenswrapper[4837]: E0111 18:47:27.534490 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54\": container with ID starting with 868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54 not found: ID does not exist" containerID="868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.534526 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54"} err="failed to get container status \"868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54\": rpc error: code = NotFound desc = could not find container \"868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54\": container with ID starting with 868f93fa2aa20460e8bc84e88e970ff1a2f0e026b391da11e3b6d69ad1da5a54 not found: ID does not exist" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.534551 4837 scope.go:117] "RemoveContainer" containerID="3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969" Jan 11 18:47:27 crc kubenswrapper[4837]: E0111 18:47:27.535121 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969\": container with ID starting with 3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969 not found: ID does not exist" containerID="3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.535152 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969"} err="failed to get container status \"3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969\": rpc error: code = NotFound desc = could not find container \"3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969\": container with ID starting with 3b7bb27a125f9cf40349714b0ecbfebb448651cea7d7099563a3f26f2c3e7969 not found: ID does not exist" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.535169 4837 scope.go:117] "RemoveContainer" containerID="d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7" Jan 11 18:47:27 crc kubenswrapper[4837]: E0111 18:47:27.535540 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7\": container with ID starting with d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7 not found: ID does not exist" containerID="d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.535599 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7"} err="failed to get container status \"d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7\": rpc error: code = NotFound desc = could not find container \"d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7\": container with ID starting with d7b57b00f517bdb9c800135643d4f219439379ce8bc2422d294f725e992563e7 not found: ID does not exist" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.774564 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:27 crc kubenswrapper[4837]: I0111 18:47:27.847494 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:28 crc kubenswrapper[4837]: I0111 18:47:28.385289 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812de67e-8984-408a-a295-38c85ac294c0" path="/var/lib/kubelet/pods/812de67e-8984-408a-a295-38c85ac294c0/volumes" Jan 11 18:47:29 crc kubenswrapper[4837]: I0111 18:47:29.801890 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kjbz"] Jan 11 18:47:29 crc kubenswrapper[4837]: I0111 18:47:29.802230 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4kjbz" podUID="5f58af82-12cb-466f-aeda-563be432b210" containerName="registry-server" containerID="cri-o://54402c14eb95baaa1b67cbb87090cfccffc14827107564037d3728236412ad7f" gracePeriod=2 Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.436004 4837 generic.go:334] "Generic (PLEG): container finished" podID="5f58af82-12cb-466f-aeda-563be432b210" containerID="54402c14eb95baaa1b67cbb87090cfccffc14827107564037d3728236412ad7f" exitCode=0 Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.436074 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kjbz" event={"ID":"5f58af82-12cb-466f-aeda-563be432b210","Type":"ContainerDied","Data":"54402c14eb95baaa1b67cbb87090cfccffc14827107564037d3728236412ad7f"} Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.802599 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.978379 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-utilities\") pod \"5f58af82-12cb-466f-aeda-563be432b210\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.978846 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgw2f\" (UniqueName: \"kubernetes.io/projected/5f58af82-12cb-466f-aeda-563be432b210-kube-api-access-lgw2f\") pod \"5f58af82-12cb-466f-aeda-563be432b210\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.979055 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-catalog-content\") pod \"5f58af82-12cb-466f-aeda-563be432b210\" (UID: \"5f58af82-12cb-466f-aeda-563be432b210\") " Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.979301 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-utilities" (OuterVolumeSpecName: "utilities") pod "5f58af82-12cb-466f-aeda-563be432b210" (UID: "5f58af82-12cb-466f-aeda-563be432b210"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.980417 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-utilities\") on node \"crc\" DevicePath \"\"" Jan 11 18:47:30 crc kubenswrapper[4837]: I0111 18:47:30.984464 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f58af82-12cb-466f-aeda-563be432b210-kube-api-access-lgw2f" (OuterVolumeSpecName: "kube-api-access-lgw2f") pod "5f58af82-12cb-466f-aeda-563be432b210" (UID: "5f58af82-12cb-466f-aeda-563be432b210"). InnerVolumeSpecName "kube-api-access-lgw2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.084358 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgw2f\" (UniqueName: \"kubernetes.io/projected/5f58af82-12cb-466f-aeda-563be432b210-kube-api-access-lgw2f\") on node \"crc\" DevicePath \"\"" Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.128444 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f58af82-12cb-466f-aeda-563be432b210" (UID: "5f58af82-12cb-466f-aeda-563be432b210"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.186643 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f58af82-12cb-466f-aeda-563be432b210-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.451124 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kjbz" event={"ID":"5f58af82-12cb-466f-aeda-563be432b210","Type":"ContainerDied","Data":"0ea74ba985086cc574fbeb59ffcaba1a44cd66aa72ef20b78c723aff5089132c"} Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.451183 4837 scope.go:117] "RemoveContainer" containerID="54402c14eb95baaa1b67cbb87090cfccffc14827107564037d3728236412ad7f" Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.451237 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kjbz" Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.506389 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kjbz"] Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.508785 4837 scope.go:117] "RemoveContainer" containerID="ea8caac0b1704c06fe0041a8487a81fc0a03ecab7b2acf9208e72196ad39bc54" Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.519261 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4kjbz"] Jan 11 18:47:31 crc kubenswrapper[4837]: I0111 18:47:31.546429 4837 scope.go:117] "RemoveContainer" containerID="771b0601aadbaa3a66e80205244749e4677ed8aa58f6c4726fce628c5b5d0719" Jan 11 18:47:32 crc kubenswrapper[4837]: I0111 18:47:32.382010 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f58af82-12cb-466f-aeda-563be432b210" path="/var/lib/kubelet/pods/5f58af82-12cb-466f-aeda-563be432b210/volumes" Jan 11 18:47:41 crc 
kubenswrapper[4837]: I0111 18:47:41.364199 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:47:41 crc kubenswrapper[4837]: E0111 18:47:41.364869 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:47:52 crc kubenswrapper[4837]: I0111 18:47:52.364946 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:47:52 crc kubenswrapper[4837]: E0111 18:47:52.366034 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:48:06 crc kubenswrapper[4837]: I0111 18:48:06.376465 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:48:06 crc kubenswrapper[4837]: E0111 18:48:06.377606 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 
11 18:48:17 crc kubenswrapper[4837]: I0111 18:48:17.363867 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:48:17 crc kubenswrapper[4837]: E0111 18:48:17.364631 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:48:30 crc kubenswrapper[4837]: I0111 18:48:30.370735 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:48:30 crc kubenswrapper[4837]: E0111 18:48:30.372901 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pqnst_openshift-machine-config-operator(1cf6fa66-290a-4e29-bafc-e60185a22fe2)\"" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" podUID="1cf6fa66-290a-4e29-bafc-e60185a22fe2" Jan 11 18:48:42 crc kubenswrapper[4837]: I0111 18:48:42.364697 4837 scope.go:117] "RemoveContainer" containerID="fe7cad95c24e1a087376edc1b36877ab7b7662b0a1fd568a5297b3984bc6ddd8" Jan 11 18:48:43 crc kubenswrapper[4837]: I0111 18:48:43.235036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pqnst" event={"ID":"1cf6fa66-290a-4e29-bafc-e60185a22fe2","Type":"ContainerStarted","Data":"9bdd1556c0d8f26ee8df065131d8b4cc10e9d8e81636987c9404154259886be0"}